FUNCTIONS OF ONE RANDOM VARIABLE


Constantin VERTAN
April 9th, 2015

As shown before, any random variable is a function $\xi : \Omega \to \mathbb{R}$; this function can be composed with any real-valued, real-argument function $g : \mathbb{R} \to \mathbb{R}$, resulting in another random variable, denoted for instance by $\eta$:

$$\eta = g \circ \xi = g(\xi), \qquad \eta : \Omega \to \mathbb{R}.$$

The following problem occurs: how to characterize the new random variable $\eta$; more specifically, how to find the probability density function of $\eta$, knowing the statistical properties of the random variable $\xi$.

Let us consider a given, fixed value $x$. The probability that the value of a particular realization of the random variable $\xi$ equals $x$ is the probability that the value of that realization lies within the infinitesimally small interval $[x, x + dx]$, $dx \to 0$. But this is:

$$\mathrm{Prob}\{\xi \in [x;\, x + dx]\} = F_\xi(x + dx) - F_\xi(x) = \int_x^{x+dx} w_\xi(t)\,dt \approx w_\xi(x)\,dx. \quad (1)$$

If the transform function $g$ is bijective, the value $x$ is uniquely mapped to the value $y = g(x)$. In the same way as (1) was deduced, we can write:

$$\mathrm{Prob}\{\eta \in [y;\, y + dy]\} = w_\eta(y)\,dy. \quad (2)$$

But since $y$ is uniquely obtained from $x$, the probabilities (1) and (2) are equal, thus:

$$w_\xi(x)\,dx = w_\eta(y)\,dy.$$

We are interested in $w_\eta(y)$, so we write:

$$w_\eta(y) = w_\xi(x)\left|\frac{dx}{dy}\right| = \left.\frac{w_\xi(x)}{|g'(x)|}\right|_{x = g^{-1}(y)} = \frac{w_\xi\big(g^{-1}(y)\big)}{\big|g'\big(g^{-1}(y)\big)\big|}. \quad (3)$$

Formula (3) is valid only in the case of a transform function $g$ which is bijective over its entire domain or, in other words, only if the equation $y = g(x)$, in which $x$ is the unknown and $y$ is a parameter, has a unique solution. If the function $g$ is not bijective, its domain must be decomposed into intervals of bijectivity. Within each such interval, the equation $y = g(x)$, in which $x$ is the unknown and $y$ is a parameter, has a unique solution, denoted by $x_k$. In this case, formula (3) becomes:

$$w_\eta(y) = \sum_k \left.\frac{w_\xi(x_k)}{|g'(x_k)|}\right|_{x_k = g_k^{-1}(y)}. \quad (4)$$

The essential condition for the above relation to hold is that the number of intervals of bijectivity be finite or countable.
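Formula (3) can be checked numerically. The sketch below is an illustration added here, not part of the original handout: it takes $\xi$ exponentially distributed with unit parameter and $g(x) = x^2$, which is bijective on the support $[0, \infty)$; formula (3) then gives $w_\eta(y) = e^{-\sqrt{y}}/(2\sqrt{y})$, hence $F_\eta(y) = 1 - e^{-\sqrt{y}}$, which is compared against a Monte Carlo estimate.

```python
import math
import random

random.seed(0)

# xi ~ Exp(1) has support [0, inf), where g(x) = x**2 is bijective.
# Formula (3): w_eta(y) = w_xi(sqrt(y)) / |g'(sqrt(y))| = exp(-sqrt(y)) / (2*sqrt(y)),
# hence F_eta(y) = 1 - exp(-sqrt(y)).
N = 200_000
eta = [random.expovariate(1.0) ** 2 for _ in range(N)]

y0 = 2.0
empirical = sum(v <= y0 for v in eta) / N      # Monte Carlo estimate of F_eta(y0)
analytic = 1.0 - math.exp(-math.sqrt(y0))      # CDF predicted by formula (3)
print(abs(empirical - analytic) < 0.01)
```

The same scheme (simulate $g(\xi)$, compare the empirical CDF against the one obtained by integrating formula (3) or (4)) works for any of the exercises below.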
This document is intended to be used in the framework of the course Decision and estimation for information processing, held for the students of the English stream at the Faculty of Electronics, Telecommunications and Information Technology of the Politehnica University of Bucharest. This material discusses applications of the fundamental theoretical definitions regarding the concept of random variable; it is intended for use at the seminars and does not replace the course notes.

For the pair of random variables $\xi$ and $\eta$ one can verify the theorem of the mean:

$$\overline{\eta} = \int_{-\infty}^{\infty} y\,w_\eta(y)\,dy = \int_{-\infty}^{\infty} g(x)\,w_\xi(x)\,dx. \quad (5)$$

Ex. 1 Let $\xi$ be a random variable uniformly distributed within $(-\pi/2, \pi/2)$ and consider the function $g : (-\pi/2, \pi/2) \to (-1, 1)$, with $g(x) = \sin(x)$. The random variable $\eta$ is given by $\eta = g(\xi)$. Compute the pdf of the random variable $\eta$.

First, one has to check the bijectivity of the transform function $g(x)$ within its domain and, if this does not hold, divide the domain into subintervals on which the function is bijective. The bijectivity can be studied in a very simple manner, by solving the equation $y = g(x)$, with the unknown $x$ and the parameter $y$. In this case, the equation $y = \sin(x)$ has a unique solution for $x \in (-\pi/2, \pi/2)$, which is $x = \arcsin(y)$. Thus $g^{-1}(y) = \arcsin(y)$, with $g^{-1} : (-1, 1) \to (-\pi/2, \pi/2)$. The derivative of the function $g(x)$ is $g'(x) = \cos(x)$. According to formula (3) for computing the new pdf, we have:

$$w_\eta(y) = \frac{w_\xi\big(g^{-1}(y)\big)}{\big|g'\big(g^{-1}(y)\big)\big|} = \frac{w_\xi(\arcsin y)}{|\cos(\arcsin y)|} = \frac{w_\xi(\arcsin y)}{\sqrt{1 - y^2}}.$$

The random variable $\xi$ is uniformly distributed within $(-\pi/2, \pi/2)$; then it follows that:

$$w_\xi(x) = \begin{cases} \dfrac{1}{\pi}, & x \in (-\pi/2, \pi/2), \\[1ex] 0, & \text{otherwise.} \end{cases}$$

Then:

$$w_\eta(y) = \begin{cases} \dfrac{1}{\pi\sqrt{1 - y^2}}, & y \in (-1, 1), \\[1ex] 0, & \text{otherwise.} \end{cases}$$

Ex. 2 A random variable $\xi$ is transformed via a linear function $g(x) = \alpha x + \beta$, $\alpha \neq 0$, into the random variable $\eta$. Compute the pdf, the mean and the variance of the random variable $\eta$, considering that $\xi$ is distributed according to a normal distribution and, respectively, according to a uniform distribution.

Any linear function is bijective; the equation $y = g(x)$ with the unknown $x$ has the solution $x = (y - \beta)/\alpha$, and thus $g^{-1}(y) = (y - \beta)/\alpha$. The derivative of the linear function is $g'(x) = \alpha$. Under these circumstances, the pdf of the random variable $\eta$ is given by formula (3):

$$w_\eta(y) = \frac{w_\xi\big(g^{-1}(y)\big)}{\big|g'\big(g^{-1}(y)\big)\big|} = \frac{1}{|\alpha|}\, w_\xi\!\left(\frac{y - \beta}{\alpha}\right). \quad (6)$$

If $\xi$ is normally distributed with mean $\mu$ and variance $\sigma^2$, then:

$$w_\xi(x) = \mathcal{N}(\mu, \sigma^2) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{(x - \mu)^2}{2\sigma^2}\right).$$

By replacing in (6) we get:

$$w_\eta(y) = \frac{1}{\sqrt{2\pi}\,|\alpha|\,\sigma} \exp\!\left(-\frac{(y - \alpha\mu - \beta)^2}{2\alpha^2\sigma^2}\right) = \mathcal{N}\big(\alpha\mu + \beta,\ \alpha^2\sigma^2\big).$$
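The result of Ex. 1 can be verified by simulation; integrating the pdf $1/(\pi\sqrt{1-y^2})$ gives the CDF $F_\eta(y) = \arcsin(y)/\pi + 1/2$ on $(-1, 1)$. The following is an illustrative sketch (the sample size and test point are arbitrary choices, not part of the exercise):

```python
import math
import random

random.seed(1)

# eta = sin(xi), with xi uniform on (-pi/2, pi/2): the law derived in Ex. 1,
# whose CDF is F_eta(y) = arcsin(y)/pi + 1/2 for y in (-1, 1).
N = 200_000
eta = [math.sin(random.uniform(-math.pi / 2, math.pi / 2)) for _ in range(N)]

y0 = 0.5
empirical = sum(v <= y0 for v in eta) / N
analytic = math.asin(y0) / math.pi + 0.5   # equals 2/3 for y0 = 0.5
print(abs(empirical - analytic) < 0.01)
```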
This means that the distribution of the random variable obtained by the linear transformation is still normal, with the mean transformed via the same linear function and the variance scaled $\alpha^2$ times.
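This preservation of the mean and variance under $g(x) = \alpha x + \beta$ can be illustrated numerically. The sketch below uses arbitrary illustrative values $\alpha = 2$, $\beta = 3$, $\mu = 1$, $\sigma = 1$ (chosen here only for the demonstration) and checks that the sample mean and variance of $\eta$ approach $\alpha\mu + \beta$ and $\alpha^2\sigma^2$:

```python
import random

random.seed(2)

alpha, beta = 2.0, 3.0   # illustrative linear transform g(x) = alpha*x + beta
mu, sigma = 1.0, 1.0     # parameters of the original normal variable xi

N = 200_000
eta = [alpha * random.gauss(mu, sigma) + beta for _ in range(N)]

mean = sum(eta) / N
var = sum((v - mean) ** 2 for v in eta) / N
print(abs(mean - (alpha * mu + beta)) < 0.02)   # mean -> alpha*mu + beta = 5
print(abs(var - alpha**2 * sigma**2) < 0.1)     # variance -> alpha^2 * sigma^2 = 4
```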

If the random variable $\xi$ is uniformly distributed within $[a; b]$, for instance, its pdf is:

$$w_\xi(x) = \begin{cases} \dfrac{1}{b - a}, & x \in [a, b], \\[1ex] 0, & \text{otherwise.} \end{cases}$$

Then:

$$w_\eta(y) = \frac{1}{|\alpha|}\, w_\xi\!\left(\frac{y - \beta}{\alpha}\right) = \begin{cases} \dfrac{1}{|\alpha|(b - a)}, & \dfrac{y - \beta}{\alpha} \in [a, b], \\[1ex] 0, & \text{otherwise.} \end{cases}$$

One has to consider two cases, according to the sign of $\alpha$.

Case 1, $\alpha > 0$:

$$w_\eta(y) = \begin{cases} \dfrac{1}{\alpha(b - a)}, & y \in [\alpha a + \beta,\ \alpha b + \beta], \\[1ex] 0, & \text{otherwise.} \end{cases}$$

Case 2, $\alpha < 0$:

$$w_\eta(y) = \begin{cases} \dfrac{1}{|\alpha|(b - a)}, & y \in [\alpha b + \beta,\ \alpha a + \beta], \\[1ex] 0, & \text{otherwise.} \end{cases}$$

In both cases one can notice that the distribution is still uniform after the transform; according to the previous derivation, the mean is the middle of the definition interval and the variance is $1/12$ of the squared interval length:

$$\overline{\eta} = \alpha\,\frac{a + b}{2} + \beta = \alpha\overline{\xi} + \beta \qquad\text{and}\qquad \sigma_\eta^2 = \frac{\alpha^2 (b - a)^2}{12} = \alpha^2 \sigma_\xi^2.$$

The new mean is the original mean transformed via the same function as the random variable, and the new variance is the scaled version of the original variance.

Ex. 3 Consider the random variable $\xi$, uniformly distributed in $[-c, c]$. Compute the pdf and the cumulative distribution function of the random variable $\eta = 1/\xi^2$.

The transform function is $g(x) = 1/x^2$, with derivative $g'(x) = -2/x^3$. The function is not bijective over the entire real axis, but it is bijective within the intervals $(-\infty, 0)$ and $(0, \infty)$. The solutions of the equation $y = g(x)$ are $x_1 = -1/\sqrt{y}$ and $x_2 = 1/\sqrt{y}$, for $y > 0$. If $y \leq 0$ the equation has no solutions and $w_\eta(y) = 0$. Then we can apply formula (4) for $y > 0$:

$$w_\eta(y) = \sum_k \left.\frac{w_\xi(x_k)}{|g'(x_k)|}\right|_{x_k = g_k^{-1}(y)} = \frac{w_\xi(-1/\sqrt{y})}{2y^{3/2}} + \frac{w_\xi(1/\sqrt{y})}{2y^{3/2}} = \frac{1}{2y^{3/2}}\Big[w_\xi(-1/\sqrt{y}) + w_\xi(1/\sqrt{y})\Big],$$

since $|g'(\pm 1/\sqrt{y})| = 2/|\pm 1/\sqrt{y}|^3 = 2y^{3/2}$.

The random variable $\xi$ is uniformly distributed; then:

$$w_\xi(x) = \begin{cases} \dfrac{1}{2c}, & x \in [-c, c], \\[1ex] 0, & \text{otherwise.} \end{cases}$$

Since $\pm 1/\sqrt{y} \in [-c, c]$ exactly when $y \geq 1/c^2$, it follows that:

$$w_\xi(-1/\sqrt{y}) = w_\xi(1/\sqrt{y}) = \begin{cases} \dfrac{1}{2c}, & y \in [1/c^2, \infty), \\[1ex] 0, & \text{otherwise.} \end{cases}$$
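For this transform, combining the two branches yields the CDF $F_\eta(y) = 1 - 1/(c\sqrt{y})$ for $y \geq 1/c^2$ (derived below), which can be confirmed by simulation. A sketch added for illustration, with $c = 2$ an arbitrary choice:

```python
import random

random.seed(3)

c = 2.0          # xi uniform on [-c, c]; c = 2 is an illustrative choice
N = 200_000
eta = []
while len(eta) < N:
    x = random.uniform(-c, c)
    if x != 0.0:                 # guard the (probability-zero) division by zero
        eta.append(1.0 / x**2)

y0 = 1.0
empirical = sum(v <= y0 for v in eta) / N
analytic = 1.0 - 1.0 / (c * y0 ** 0.5)   # F_eta(1) = 1 - 1/(2*1) = 0.5 here
print(abs(empirical - analytic) < 0.01)
```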

Then:

$$w_\eta(y) = \begin{cases} \dfrac{1}{2c\,y^{3/2}}, & y \in [1/c^2, \infty), \\[1ex] 0, & \text{otherwise.} \end{cases}$$

The cumulative distribution function of the random variable $\eta$ is:

$$F_\eta(y) = \int_{-\infty}^{y} w_\eta(t)\,dt = \begin{cases} 0, & y < 1/c^2, \\[1ex] 1 - \dfrac{1}{c\sqrt{y}}, & y \in [1/c^2, \infty). \end{cases}$$

Ex. 4 A signal with values distributed normally according to $\mathcal{N}(0, \sigma^2)$ (zero mean and variance $\sigma^2$) is rectified by an ideal diode. Compute the pdf of the values of the resulting signal.

The transform function of the rectifier circuit containing a single diode (half-wave rectifier) is:

$$g(x) = \begin{cases} x, & x \geq 0, \\ 0, & x < 0. \end{cases}$$

The function $g(x)$ is not bijective; in this case one cannot simply apply formula (4), since the solution set of the equation $y = g(x)$ for $x < 0$ is not countable; more precisely, $\{x \in \mathbb{R} \mid g(x) = 0\} = (-\infty, 0]$. The problem will be solved by computing the cumulative distribution function $F_\eta(y) = \mathrm{Prob}\{\eta \leq y\}$.

If $y < 0$: $F_\eta(y) = \mathrm{Prob}\{\eta \leq y\} \leq \mathrm{Prob}\{\eta < 0\} = 0$.

If $y = 0$: $F_\eta(0) = \mathrm{Prob}\{\eta \leq 0\} = \mathrm{Prob}\{\eta = 0\} = \mathrm{Prob}\{\xi \leq 0\} = F_\xi(0) = 0.5$.

If $y > 0$: $F_\eta(y) = \mathrm{Prob}\{\eta \leq y\} = \mathrm{Prob}\{\xi \leq y\} = F_\xi(y)$.

In conclusion:

$$F_\eta(y) = \begin{cases} 0, & y < 0, \\ F_\xi(y), & y \geq 0. \end{cases}$$

The desired pdf is the derivative of $F_\eta(y)$, that is:

$$w_\eta(y) = \frac{dF_\eta(y)}{dy} = 0.5\,\delta(y) + \mathcal{N}(0, \sigma^2)\,U(y),$$

where $U$ is the unit step function. One has to notice that the new random variable $\eta$ has a concentrated probability at the value zero, that is, the probability of obtaining exactly the value zero is non-zero:

$$P\{\eta = 0\} = \lim_{\varepsilon \to 0} \int_{-\varepsilon}^{\varepsilon} w_\eta(y)\,dy = \lim_{\varepsilon \to 0} \int_{-\varepsilon}^{\varepsilon} \frac{1}{2}\,\delta(y)\,dy + \lim_{\varepsilon \to 0} \int_{0}^{\varepsilon} w_\xi(x)\,dx = \frac{1}{2}.$$

Ex. 5 Find the transform function that maps a uniform distribution within $[0; 1]$ into a Rayleigh distribution.

Let $\xi$ be the uniformly distributed random variable and $\eta$ the Rayleigh distributed random variable. Then:

$$w_\xi(x) = \begin{cases} 1, & x \in [0, 1], \\ 0, & \text{otherwise,} \end{cases} \qquad w_\eta(y) = \begin{cases} 2y\,e^{-y^2}, & y \geq 0, \\ 0, & \text{otherwise.} \end{cases}$$

The supports of the two pdfs are $[0, 1]$ and $[0, \infty)$, respectively, so the unknown transform function must satisfy $g : [0, 1] \to [0, \infty)$. Let us assume that the function $g$ is bijective; then, according to (3), we have:

$$w_\eta(y) = \frac{w_\xi\big(g^{-1}(y)\big)}{\big|g'\big(g^{-1}(y)\big)\big|}.$$

The inverse of the transform function exists and it is $g^{-1} : [0, \infty) \to [0, 1]$. This means that $w_\xi\big(g^{-1}(y)\big) = 1$ and thus:

$$\frac{1}{\big|g'\big(g^{-1}(y)\big)\big|} = w_\eta(y).$$

But since $g$ is bijective, it is strictly monotonic. By imposing $g(0) = 0$, it follows that the function cannot be otherwise than increasing, and its derivative is positive. Then:

$$\big(g^{-1}\big)'(y) = w_\eta(y), \qquad g^{-1}(y) = \int_0^y w_\eta(t)\,dt = \int_0^y 2t\,e^{-t^2}\,dt = 1 - e^{-y^2} = x.$$

From here one evaluates $y$ as a function of $x$:

$$y = \sqrt{-\ln(1 - x)}.$$

Thus $g(x) = \sqrt{-\ln(1 - x)}$.

Ex. 6 Prove that, for any random variable $\xi$, its cumulative distribution function transforms it into a random variable uniformly distributed within $[0, 1]$.

If the pdf of the random variable $\xi$ is $w_\xi(x)$, then its associated cumulative distribution function is $F_\xi(x) = \int_{-\infty}^{x} w_\xi(t)\,dt$. If this is also the transform function of the random variable, then $g(x) = F_\xi(x)$, with $g : \mathbb{R} \to [0, 1]$. The derivative of the transform function is:

$$g'(x) = \frac{dF_\xi(x)}{dx} = w_\xi(x), \qquad |g'(x)| = w_\xi(x),$$

since the pdf is non-negative. Then, according to (3), we have:

$$w_\eta(y) = \left.\frac{w_\xi(x)}{|g'(x)|}\right|_{x = g^{-1}(y)} = \left.\frac{w_\xi(x)}{w_\xi(x)}\right|_{x = F_\xi^{-1}(y)} = 1, \qquad y \in [0; 1].$$

This is indeed a uniform distribution within $[0, 1]$.

Ex. 7 The electrical power dissipated in a resistor $R = 1\,\mathrm{k\Omega}$ is modelled as a random variable with a uniform distribution within $[P_{min}, P_{max}]$. Find the distribution of the values of the current through the resistor.

Ex. 8 A resistor of constant value $R$ is connected to a current generator. The generated current can be considered a random variable uniformly distributed within $[I_{min}, I_{max}]$. Compute the mean power dissipated in the resistor and the pdf of that power.

Ex. 9 Prove, using the theorem of the mean (5), that for a linear transform function $g(x) = \alpha x + \beta$ the variance of the resulting random variable is the original variance scaled $\alpha^2$ times, and the resulting mean is the original mean transformed via $g(x)$.

Ex. 10 Compute the mean information obtained following the realization of an event whose probability is distributed according to a $1/x$ function within $[1, 2]$.

Ex. 11
By measuring the anodic current of a vacuum diode one notices a linear distribution between 0 and 1 mA. The anodic voltage of the vacuum diode is generated by a voltage source that has to be tested: the dispersion of the voltage must not exceed 10 V. If $A = 1/1000\ \mathrm{mA/V^{3/2}}$ (the anodic current obeying $I = A\,U^{3/2}$), is the tested generator fit?

Ex. 12 The transfer function of an ideal bi-phase (full-wave) rectifier is described by the function $g(x) = |x|$. Compute the pdf, the mean voltage and the mean power of the random signal $\eta(t)$ obtained after rectification, if $\xi(t)$ is a) normally distributed, $\mathcal{N}(0, \sigma^2)$; b) uniformly distributed in $[-1; 1]$ and in $[0; 1]$.

Ex. 13 A non-linear limiter is defined by the transfer function:

$$y = g(x) = \begin{cases} 0, & x \leq 0, \\ x^2, & x \in (0, 1], \\ 1, & x > 1. \end{cases}$$

At the input of the circuit a signal is applied whose values follow the pdf:

$$w_X(x) = \frac{1}{2}\,e^{-2|x|} + \frac{1}{2}\,\delta(x).$$

Compute the pdf of the output signal.

Acknowledgement This work has been funded by the Sectoral Operational Programme Human Resources Development 2007-2013 of the Romanian Ministry of Education and Scientific Research through the Financial Agreement POSDRU/174/1.3/S/149155.