04. Random Variables: Concepts


University of Rhode Island — Nonequilibrium Statistical Physics — Physics Course Materials

04. Random Variables: Concepts
Gerhard Müller, University of Rhode Island

Creative Commons License: This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 4.0 License.

Follow this and additional works at: nonequilibrium_statistical_physics
Part of the Physics Commons

Abstract: Part four of course materials for Nonequilibrium Statistical Physics (Physics 626), taught by Gerhard Müller at the University of Rhode Island. Entries listed in the table of contents, but not shown in the document, exist only in handwritten form. Documents will be updated periodically as more entries become presentable. Updated with version 2 on 5/3/2016.

Recommended Citation: Müller, Gerhard, "04. Random Variables: Concepts" (2015). Nonequilibrium Statistical Physics. Paper 4.

This Course Material is brought to you for free and open access by the Physics Course Materials at DigitalCommons@URI. It has been accepted for inclusion in Nonequilibrium Statistical Physics by an authorized administrator of DigitalCommons@URI. For more information, please contact digitalcommons@etal.uri.edu.

Contents of this Document [ntc4]

04. Random Variables: Concepts

- Probability distributions [nln46]
- Characteristic function, moments, and cumulants [nln47]
- Cumulants expressed in terms of moments [nex126]
- Generating function and factorial moments [nln48]
- Multivariate distributions [nln7]
- Transformation of random variables [nln49]
- Sums of independent exponentials [nex127]
- Propagation of statistical uncertainty [nex24]
- Chebyshev's inequality [nex6]
- Law of large numbers [nex7]
- Binomial, Poisson, and Gaussian distribution [nln8]
- Binomial to Poisson distribution [nex15]
- De Moivre-Laplace limit theorem [nex21]
- Central limit theorem [nln9]
- Multivariate Gaussian distribution
- Robust probability distributions [nex19]
- Stable probability distributions [nex81]
- Exponential distribution [nln1]
- Waiting time problem [nln11]
- Pascal distribution [nex22]

Probability Distribution [nln46]

Experiment represented by events in a sample space: $S = \{A_1, A_2, \ldots\}$.
Measurements represented by a stochastic variable: $X = \{x_1, x_2, \ldots\}$.

The maximum amount of information experimentally obtainable is contained in the probability distribution:
$$P_X(x_i), \qquad \sum_i P_X(x_i) = 1.$$

Partial information is contained in moments,
$$\langle X^n \rangle = \sum_i x_i^n\, P_X(x_i), \qquad n = 1, 2, \ldots,$$
or cumulants (as defined in [nln47]):
$$\langle\langle X \rangle\rangle = \langle X \rangle \quad \text{(mean value)},$$
$$\langle\langle X^2 \rangle\rangle = \langle X^2 \rangle - \langle X \rangle^2 \quad \text{(variance)},$$
$$\langle\langle X^3 \rangle\rangle = \langle X^3 \rangle - 3\langle X^2 \rangle \langle X \rangle + 2\langle X \rangle^3.$$

The variance is the square of the standard deviation: $\langle\langle X^2 \rangle\rangle = \sigma_X^2$.

For continuous stochastic variables we have
$$P_X(x), \qquad \int dx\, P_X(x) = 1, \qquad \langle X^n \rangle = \int dx\, x^n P_X(x).$$

In the literature $P_X(x)$ is often named probability density, and the term distribution is used for
$$F_X(x) = \sum_{x_i < x} P_X(x_i) \qquad \text{or} \qquad F_X(x) = \int_{-\infty}^{x} dx'\, P_X(x')$$
in a cumulative sense.
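
A minimal numerical sketch (Python with NumPy, an assumption of this note; the course materials prescribe no code) of how moments estimated from samples combine into the first three cumulants:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=1_000_000)  # samples of X, exponential with xi = 2

# Moments <X^n> estimated as sample averages.
m1, m2, m3 = (np.mean(x**n) for n in (1, 2, 3))

# Cumulants in terms of moments (see [nln47]):
c1 = m1                          # mean value
c2 = m2 - m1**2                  # variance
c3 = m3 - 3*m2*m1 + 2*m1**3      # third cumulant

# For the exponential distribution <<X^n>> = (n-1)! xi^n, i.e. 2, 4, 16 here.
print(c1, c2, c3)
```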

Characteristic Function [nln47]

Fourier transform:
$$\Phi_X(k) \doteq \langle e^{ikx} \rangle = \int_{-\infty}^{+\infty} dx\, e^{ikx}\, P_X(x).$$

Attributes: $\Phi_X(0) = 1$, $|\Phi_X(k)| \leq 1$.

Moment generating function:
$$\Phi_X(k) = \int dx\, P_X(x) \sum_{n=0}^{\infty} \frac{(ik)^n}{n!}\, x^n = \sum_{n=0}^{\infty} \frac{(ik)^n}{n!}\, \langle X^n \rangle,$$
$$\langle X^n \rangle = \int dx\, x^n P_X(x) = (-i)^n\, \frac{d^n}{dk^n}\, \Phi_X(k) \Big|_{k=0}.$$

Cumulant generating function:
$$\ln \Phi_X(k) \doteq \sum_{n=1}^{\infty} \frac{(ik)^n}{n!}\, \langle\langle X^n \rangle\rangle, \qquad \langle\langle X^n \rangle\rangle = (-i)^n\, \frac{d^n}{dk^n}\, \ln \Phi_X(k) \Big|_{k=0}.$$

Cumulants in terms of moments (with $\Delta X \doteq X - \langle X \rangle$): [nex126]
$$\langle\langle X \rangle\rangle = \langle X \rangle,$$
$$\langle\langle X^2 \rangle\rangle = \langle X^2 \rangle - \langle X \rangle^2 = \langle (\Delta X)^2 \rangle,$$
$$\langle\langle X^3 \rangle\rangle = \langle (\Delta X)^3 \rangle,$$
$$\langle\langle X^4 \rangle\rangle = \langle (\Delta X)^4 \rangle - 3 \langle (\Delta X)^2 \rangle^2.$$

Theorem of Marcinkiewicz: $\ln \Phi_X(k)$ can only be a polynomial if its degree is $n \leq 2$.
$n = 1$: $\ln \Phi_X(k) = ika \;\Rightarrow\; P_X(x) = \delta(x - a)$.
$n = 2$: $\ln \Phi_X(k) = ika - \frac{1}{2} b k^2 \;\Rightarrow\; P_X(x) = \frac{1}{\sqrt{2\pi b}} \exp\left(-\frac{(x-a)^2}{2b}\right)$.

Consequence: any probability distribution has either one, two, or infinitely many non-vanishing cumulants.
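
The derivative formulas can be checked symbolically. A sketch (SymPy assumed, not part of the course materials) that extracts the first four cumulants of the Poisson distribution from $\ln \Phi_X(k)$; all of them come out equal to $a$, consistent with [nln8]:

```python
import sympy as sp

k = sp.symbols('k', real=True)
a = sp.symbols('a', positive=True)

# Characteristic function of the Poisson distribution: Phi(k) = exp(a(e^{ik} - 1)).
Phi = sp.exp(a*(sp.exp(sp.I*k) - 1))

for n in range(1, 5):
    # <<X^n>> = (-i)^n d^n/dk^n ln Phi(k) at k = 0
    cum = ((-sp.I)**n * sp.diff(sp.log(Phi), k, n)).subs(k, 0)
    print(n, sp.simplify(cum))   # prints a for every n
```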

[nex126] Cumulants expressed in terms of moments

The characteristic function $\Phi_X(k)$ of a probability distribution $P_X(x)$, obtained via Fourier transform as described in [nln47], can be used to generate the moments $\langle X^n \rangle$ and the cumulants $\langle\langle X^n \rangle\rangle$ via the expansions
$$\Phi_X(k) = \sum_{n=0}^{\infty} \frac{(ik)^n}{n!}\, \langle X^n \rangle, \qquad \ln \Phi_X(k) = \sum_{n=1}^{\infty} \frac{(ik)^n}{n!}\, \langle\langle X^n \rangle\rangle.$$
Use these relations to express the first four cumulants in terms of the first four moments. The results are stated in [nln47]. Describe your work in some detail.

Solution:

Generating Function [nln48]

The generating function $G_X(z)$ is a representation of the characteristic function $\Phi_X(k)$ that is most commonly used, along with factorial moments and factorial cumulants, if the stochastic variable $X$ is integer valued.

Definition: $G_X(z) \doteq \langle z^X \rangle$ with $|z| = 1$.

Application to continuous and discrete (integer-valued) stochastic variables:
$$G_X(z) = \int dx\, z^x\, P_X(x), \qquad G_X(z) = \sum_n z^n\, P_X(n).$$

Definition of factorial moments:
$$\langle X^m \rangle_f \doteq \langle X(X-1) \cdots (X-m+1) \rangle, \quad m \geq 1; \qquad \langle X^0 \rangle_f \doteq 1.$$

Function generating factorial moments:
$$G_X(z) = \sum_{m=0}^{\infty} \frac{(z-1)^m}{m!}\, \langle X^m \rangle_f, \qquad \langle X^m \rangle_f = \frac{d^m}{dz^m}\, G_X(z) \Big|_{z=1}.$$

Function generating factorial cumulants:
$$\ln G_X(z) = \sum_{m=1}^{\infty} \frac{(z-1)^m}{m!}\, \langle\langle X^m \rangle\rangle_f, \qquad \langle\langle X^m \rangle\rangle_f = \frac{d^m}{dz^m}\, \ln G_X(z) \Big|_{z=1}.$$

Applications:
- Moments and cumulants of the Poisson distribution [nex16]
- Pascal distribution [nex22]
- Reconstructing probability distributions [nex14]
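
A short symbolic check (SymPy assumed) of both generating formulas for the Poisson distribution, whose factorial moments and factorial cumulants are quoted in [nln8]:

```python
import sympy as sp

z = sp.symbols('z')
a = sp.symbols('a', positive=True)
n = sp.symbols('n', integer=True, nonnegative=True)

# G(z) = sum_n z^n a^n e^{-a}/n!  ->  exp(a(z - 1))
G = sp.simplify(sp.summation(z**n * a**n * sp.exp(-a) / sp.factorial(n), (n, 0, sp.oo)))

for m in range(1, 4):
    fmom = sp.diff(G, z, m).subs(z, 1)           # factorial moments: a**m
    fcum = sp.diff(sp.log(G), z, m).subs(z, 1)   # factorial cumulants: a*delta_{m,1}
    print(m, sp.simplify(fmom), sp.simplify(fcum))
```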

Multivariate Distributions [nln7]

Let $\mathbf{X} = (X_1, \ldots, X_n)$ be a random vector variable with $n$ components.

Joint probability distribution: $P(x_1, \ldots, x_n)$.

Marginal probability distribution:
$$P(x_1, \ldots, x_m) = \int dx_{m+1} \cdots \int dx_n\, P(x_1, \ldots, x_n).$$

Conditional probability distribution $P(x_1, \ldots, x_m | x_{m+1}, \ldots, x_n)$:
$$P(x_1, \ldots, x_n) = P(x_1, \ldots, x_m | x_{m+1}, \ldots, x_n)\, P(x_{m+1}, \ldots, x_n).$$

Moments:
$$\langle X_1^{m_1} \cdots X_n^{m_n} \rangle = \int dx_1 \cdots \int dx_n\, x_1^{m_1} \cdots x_n^{m_n}\, P(x_1, \ldots, x_n).$$

Characteristic function: $\Phi(\mathbf{k}) = \langle e^{i \mathbf{k} \cdot \mathbf{X}} \rangle$.

Moment expansion:
$$\Phi(\mathbf{k}) = \sum \frac{(ik_1)^{m_1} \cdots (ik_n)^{m_n}}{m_1! \cdots m_n!}\, \langle X_1^{m_1} \cdots X_n^{m_n} \rangle.$$

Cumulant expansion:
$$\ln \Phi(\mathbf{k}) = {\sum}' \frac{(ik_1)^{m_1} \cdots (ik_n)^{m_n}}{m_1! \cdots m_n!}\, \langle\langle X_1^{m_1} \cdots X_n^{m_n} \rangle\rangle$$
(the prime indicates absence of the term with $m_1 = \cdots = m_n = 0$).

Covariance matrix: $\langle\langle X_i X_j \rangle\rangle = \langle (X_i - \langle X_i \rangle)(X_j - \langle X_j \rangle) \rangle$
($i = j$: variances, $i \neq j$: covariances).

Correlations: $C(X_i, X_j) = \langle X_i X_j \rangle - \langle X_i \rangle \langle X_j \rangle$.

Statistical independence of $X_1, X_2$: $P(x_1, x_2) = P_1(x_1)\, P_2(x_2)$.

Equivalent criteria for statistical independence:
- all moments factorize: $\langle X_1^{m_1} X_2^{m_2} \rangle = \langle X_1^{m_1} \rangle \langle X_2^{m_2} \rangle$;
- the characteristic function factorizes: $\Phi(k_1, k_2) = \Phi_1(k_1)\, \Phi_2(k_2)$;
- all cumulants $\langle\langle X_1^{m_1} X_2^{m_2} \rangle\rangle$ with $m_1 m_2 \neq 0$ vanish.

If $\langle\langle X_1 X_2 \rangle\rangle = 0$, then $X_1, X_2$ are called uncorrelated. This property does not imply statistical independence.

Transformation of Random Variables [nln49]

Consider two random variables $X$ and $Y$ that are functionally related: $Y = f(X)$ or $X = g(Y)$. If the probability distribution for $X$ is known, then the probability distribution for $Y$ is determined as follows:
$$P_Y(y)\, \Delta y = \int_{y < f(x) < y + \Delta y} dx\, P_X(x)$$
$$\Rightarrow\; P_Y(y) = \int dx\, P_X(x)\, \delta\big(y - f(x)\big) = P_X\big(g(y)\big)\, \big|g'(y)\big|.$$

Consider two random variables $X_1, X_2$ with a joint probability distribution $P_{12}(x_1, x_2)$. The probability distribution of the random variable $Y = X_1 + X_2$ is then determined as
$$P_Y(y) = \int dx_1 \int dx_2\, P_{12}(x_1, x_2)\, \delta(y - x_1 - x_2) = \int dx_1\, P_{12}(x_1, y - x_1),$$
and the probability distribution of the random variable $Z = X_1 X_2$ as
$$P_Z(z) = \int dx_1 \int dx_2\, P_{12}(x_1, x_2)\, \delta(z - x_1 x_2) = \int \frac{dx_1}{|x_1|}\, P_{12}(x_1, z/x_1).$$

If the two random variables $X_1, X_2$ are statistically independent, we can substitute $P_{12}(x_1, x_2) = P_1(x_1)\, P_2(x_2)$ in the above integrals.

Applications:
- Transformation of statistical uncertainty [nex24]
- Chebyshev inequality [nex6]
- Robust probability distributions [nex19]
- Statistically independent or merely uncorrelated? [nex23]
- Sum and product of uniform distributions [nex96]
- Exponential integral distribution [nex79]
- Generating exponential and Lorentzian random numbers [nex8]
- From Gaussian to exponential distribution [nex8]
- Transforming a pair of random variables [nex78]
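
The first transformation rule is what makes inverse-transform sampling work. A Monte Carlo sketch (Python/NumPy assumed) in the spirit of [nex8]: $Y = -\xi \ln U$ with $U$ uniform on $(0,1)$ has the density $P_X(g(y))\,|g'(y)| = e^{-y/\xi}/\xi$:

```python
import numpy as np

rng = np.random.default_rng(0)
xi = 1.5
u = rng.uniform(size=500_000)
y = -xi * np.log(u)               # Y = f(U); inverse: U = g(Y) = e^{-Y/xi}

# Compare a histogram of y with the predicted exponential density.
hist, edges = np.histogram(y, bins=50, range=(0, 10), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
predicted = np.exp(-centers / xi) / xi
print(np.max(np.abs(hist - predicted)))   # small for this sample size
```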

[nex127] Sums of independent exponentials

Consider $n$ independent random variables $X_1, \ldots, X_n$ with range $x_i \geq 0$ and identical exponential distributions,
$$P_1(x_i) = \frac{1}{\xi}\, e^{-x_i/\xi}, \qquad i = 1, \ldots, n.$$
Use the transformation relation from [nln49],
$$P_2(x) = \int dx_1 \int dx_2\, P_1(x_1) P_1(x_2)\, \delta(x - x_1 - x_2) = \int dx_1\, P_1(x_1)\, P_1(x - x_1),$$
inductively to calculate the probability distribution $P_n(x)$, $n \geq 2$, of the stochastic variable $X = X_1 + \cdots + X_n$. Find the mean value $\langle X \rangle$, the variance $\langle\langle X^2 \rangle\rangle$, and the value $x_p$ where $P_n(x)$ has its peak value.

Solution:
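
For readers who want a numerical cross-check of their result: the closed form is the Gamma (Erlang) density, a standard fact about sums of i.i.d. exponentials. A Monte Carlo sketch (Python/NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(2)
n, xi = 4, 1.0
x = rng.exponential(scale=xi, size=(1_000_000, n)).sum(axis=1)

# Known closed form (Erlang): P_n(x) = x^(n-1) e^(-x/xi) / (xi^n (n-1)!),
# hence <X> = n*xi, <<X^2>> = n*xi^2, and the peak sits at x_p = (n-1)*xi.
hist, edges = np.histogram(x, bins=200, range=(0, 20), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(x.mean(), x.var(), centers[np.argmax(hist)])   # ~ 4.0, 4.0, 3.0
```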

[nex24] Transformation of statistical uncertainty

From a given stochastic variable $X$ with probability distribution $P_X(x)$ we can calculate the probability distribution of the stochastic variable $Y = f(X)$ via the relation
$$P_Y(y) = \int dx\, P_X(x)\, \delta\big(y - f(x)\big).$$
Show by systematic expansion that if $P_X(x)$ is sufficiently narrow and $f(x)$ sufficiently smooth, then the mean values and the standard deviations of the two stochastic variables are related to each other as follows:
$$\langle Y \rangle = f(\langle X \rangle), \qquad \sigma_Y = \big|f'(\langle X \rangle)\big|\, \sigma_X.$$

Solution:
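
A numerical sketch (Python/NumPy assumed) of the leading-order propagation formulas for a narrow Gaussian input and an arbitrary smooth $f$:

```python
import numpy as np

rng = np.random.default_rng(3)
mu, sigma = 5.0, 0.01                     # narrow distribution around mu
x = rng.normal(mu, sigma, size=1_000_000)

f = lambda t: np.sin(t) + t**2            # any smooth function
y = f(x)

fprime_mu = np.cos(mu) + 2*mu             # f'(<X>)
print(y.mean(), f(mu))                    # <Y> ~ f(<X>)
print(y.std(), abs(fprime_mu) * sigma)    # sigma_Y ~ |f'(<X>)| sigma_X
```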

[nex6] Chebyshev's inequality

Chebyshev's inequality is a rigorous relation between the standard deviation $\sigma_X = \sqrt{\langle X^2 \rangle - \langle X \rangle^2}$ of the random variable $X$ and the probability of deviations from the mean value $\langle X \rangle$ greater than a given magnitude $a$:
$$P\big[(x - \langle X \rangle)^2 > a^2\big] \leq \Big(\frac{\sigma_X}{a}\Big)^2.$$
Prove Chebyshev's inequality starting from the following relation, commonly used for the transformation of stochastic variables (as in [nln49]):
$$P_Y(y) = \int dx\, \delta\big(y - f(x)\big)\, P_X(x) \quad \text{with} \quad f(x) = (x - \langle X \rangle)^2.$$

Solution:
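
A quick empirical check of the bound (Python/NumPy assumed); the exponential distribution is an arbitrary choice here:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.exponential(scale=1.0, size=1_000_000)

mean, sigma = x.mean(), x.std()
for a in (1.0, 2.0, 3.0):
    tail = np.mean((x - mean)**2 > a**2)      # empirical deviation probability
    print(a, tail, (sigma/a)**2)              # tail stays below (sigma/a)^2
```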

[nex7] Law of large numbers

Let $X_1, \ldots, X_N$ be $N$ statistically independent random variables described by the same probability distribution $P_X(x)$ with mean value $\langle X \rangle$ and standard deviation $\sigma_X = \sqrt{\langle X^2 \rangle - \langle X \rangle^2}$. These random variables might represent, for example, a series of measurements under the same (controllable) conditions. The law of large numbers states that the uncertainty (as measured by the standard deviation) of the stochastic variable $Y = (X_1 + \cdots + X_N)/N$ is
$$\sigma_Y = \frac{\sigma_X}{\sqrt{N}}.$$
Prove this result.

Solution:
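
A numerical illustration (Python/NumPy assumed) of the $N^{-1/2}$ shrinkage, using uniform random variables with $\sigma_X = 1/\sqrt{12}$:

```python
import numpy as np

rng = np.random.default_rng(5)
sigma_x = np.sqrt(1/12)                             # std of U(0,1)

for N in (10, 100, 1000):
    y = rng.uniform(size=(10_000, N)).mean(axis=1)  # many realizations of Y
    print(N, y.std(), sigma_x / np.sqrt(N))         # the two columns agree
```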

Binomial, Poisson, and Gaussian Distributions [nln8]

Consider a set of $N$ independent experiments, each having two possible outcomes occurring with given probabilities:
events $A + B = S$, probabilities $p + q = 1$, random variables $n + m = N$.

Binomial distribution:
$$P_N(n) = \frac{N!}{n!(N-n)!}\, p^n (1-p)^{N-n}.$$
Mean value: $\langle n \rangle = Np$. Variance: $\langle\langle n^2 \rangle\rangle = Npq$. [nex15]

In the following we consider two different asymptotic distributions in the limit $N \to \infty$.

Poisson distribution:
Limit #1: $N \to \infty$, $p \to 0$ such that $Np = \langle n \rangle = a$ stays finite [nex15].
$$P(n) = \frac{a^n}{n!}\, e^{-a}.$$
Cumulants: $\langle\langle n^m \rangle\rangle = a$. Factorial cumulants: $\langle\langle n^m \rangle\rangle_f = a\, \delta_{m,1}$. [nex16]
Single parameter: $\langle n \rangle = \langle\langle n^2 \rangle\rangle = a$.

Gaussian distribution:
Limit #2: $N \gg 1$, $p > 0$, with $Np$ and $Npq$ both large.
$$P_N(n) = \frac{1}{\sqrt{2\pi \langle\langle n^2 \rangle\rangle}} \exp\left(-\frac{(n - \langle n \rangle)^2}{2 \langle\langle n^2 \rangle\rangle}\right).$$
Derivation: De Moivre-Laplace limit theorem [nex21].
Two parameters: $\langle n \rangle = Np$, $\langle\langle n^2 \rangle\rangle = Npq$.
Special case of the central limit theorem [nln9].

[nex15] Binomial to Poisson distribution

Consider the binomial distribution for two events $A$, $B$ that occur with probabilities $P(A) \doteq p$, $P(B) = 1 - p \doteq q$, respectively:
$$P_N(n) = \frac{N!}{n!(N-n)!}\, p^n q^{N-n},$$
where $N$ is the number of (independent) experiments performed, and $n$ is the stochastic variable that counts the number of realizations of event $A$.
(a) Find the mean value $\langle n \rangle$ and the variance $\langle\langle n^2 \rangle\rangle$ of the stochastic variable $n$.
(b) Show that for $N \to \infty$, $p \to 0$ with $Np \to a > 0$, the binomial distribution turns into the Poisson distribution
$$P(n) = \frac{a^n}{n!}\, e^{-a}.$$

Solution:
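
A numerical companion to part (b) (plain Python assumed): the total variation distance between the binomial distribution with $p = a/N$ and the limiting Poisson distribution shrinks as $N$ grows. The sum is truncated at $n = N$, which neglects an exponentially small Poisson tail:

```python
from math import comb, exp, factorial

a = 2.0
for N in (10, 100, 1000):
    p = a / N
    tv = 0.5 * sum(abs(comb(N, n) * p**n * (1 - p)**(N - n)
                       - a**n / factorial(n) * exp(-a))
                   for n in range(N + 1))
    print(N, tv)   # decreases roughly like 1/N
```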

[nex21] De Moivre-Laplace limit theorem

Show that for large $Np$ and large $Npq$ the binomial distribution turns into the Gaussian distribution with the same mean value $\langle n \rangle = Np$ and variance $\langle\langle n^2 \rangle\rangle = Npq$:
$$P_N(n) = \frac{N!}{n!(N-n)!}\, p^n q^{N-n} \;\longrightarrow\; P_N(n) = \frac{1}{\sqrt{2\pi \langle\langle n^2 \rangle\rangle}} \exp\left(-\frac{(n - \langle n \rangle)^2}{2 \langle\langle n^2 \rangle\rangle}\right).$$

Solution:

Central Limit Theorem [nln9]

The central limit theorem is a major extension of the law of large numbers. It explains the unique role of the Gaussian distribution in statistical physics.

Given are a large number of statistically independent random variables $X_i$, $i = 1, \ldots, N$, with equal probability distributions $P_X(x_i)$. The only restriction on the shape of $P_X(x_i)$ is that the moments $\langle X_i^n \rangle = \langle X^n \rangle$ are finite for all $n$.

Goal: find the probability distribution $P_Y(y)$ for the random variable $Y = \big[(X_1 - \langle X \rangle) + \cdots + (X_N - \langle X \rangle)\big]/N$:
$$P_Y(y) = \int dx_1\, P_X(x_1) \cdots \int dx_N\, P_X(x_N)\, \delta\Big(y - \frac{1}{N} \sum_{i=1}^{N} [x_i - \langle X \rangle]\Big).$$

Characteristic function:
$$\Phi_Y(k) \doteq \int dy\, e^{iky}\, P_Y(y), \qquad P_Y(y) = \frac{1}{2\pi} \int dk\, e^{-iky}\, \Phi_Y(k).$$
$$\Phi_Y(k) = \int dx_1\, P_X(x_1) \cdots \int dx_N\, P_X(x_N)\, \exp\Big(i \frac{k}{N} \sum_{i=1}^{N} [x_i - \langle X \rangle]\Big) = \big[\Phi(k/N)\big]^N,$$
$$\Phi(k/N) = \int dx\, e^{i(k/N)(x - \langle X \rangle)}\, P_X(x) = \exp\left(-\frac{1}{2}\Big(\frac{k}{N}\Big)^2 \langle\langle X^2 \rangle\rangle + O\Big(\Big(\frac{k}{N}\Big)^3\Big)\right),$$
where we have performed a cumulant expansion to leading order.
$$\Phi_Y(k) = \left[1 - \frac{k^2 \langle\langle X^2 \rangle\rangle}{2N^2} + O\Big(\frac{k^3}{N^3}\Big)\right]^N \;\xrightarrow{N \to \infty}\; \exp\left(-\frac{k^2 \langle\langle X^2 \rangle\rangle}{2N}\right),$$
where we have used $\lim_{N \to \infty} (1 + z/N)^N = e^z$.
$$P_Y(y) = \sqrt{\frac{N}{2\pi \langle\langle X^2 \rangle\rangle}}\, \exp\left(-\frac{N y^2}{2 \langle\langle X^2 \rangle\rangle}\right) = \frac{1}{\sqrt{2\pi \langle\langle Y^2 \rangle\rangle}}\, e^{-y^2 / 2 \langle\langle Y^2 \rangle\rangle}$$
with variance $\langle\langle Y^2 \rangle\rangle = \langle\langle X^2 \rangle\rangle / N$.

Note that regardless of the form of $P_X(x)$, the average of a large number of (independent) measurements of $X$ will be a Gaussian with standard deviation $\sigma_Y = \sigma_X / \sqrt{N}$.
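
A Monte Carlo sketch (Python/NumPy assumed) of the theorem for uniform $X$, where $\langle X \rangle = 1/2$ and $\langle\langle X^2 \rangle\rangle = 1/12$: the histogram of $Y$ approaches the predicted Gaussian with variance $\langle\langle X^2 \rangle\rangle / N$:

```python
import numpy as np

rng = np.random.default_rng(6)
N, trials = 100, 100_000

x = rng.uniform(size=(trials, N))
y = (x - 0.5).mean(axis=1)                 # Y = (1/N) sum_i (X_i - <X>)

hist, edges = np.histogram(y, bins=60, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
var_y = (1/12) / N
gauss = np.exp(-centers**2 / (2*var_y)) / np.sqrt(2*np.pi*var_y)
print(np.max(np.abs(hist - gauss)))        # small compared to the peak height
```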

[nex19] Robust probability distributions

Consider two independent stochastic variables $X_1$ and $X_2$, each specified by the same probability distribution $P_X(x)$. Show that if $P_X(x)$ is either a Gaussian, a Lorentzian, or a Poisson distribution,
$$\text{(i)}\;\; P_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-x^2/2\sigma^2}, \qquad \text{(ii)}\;\; P_X(x) = \frac{1}{\pi}\, \frac{a}{x^2 + a^2}, \qquad \text{(iii)}\;\; P_X(x = n) = \frac{a^n}{n!}\, e^{-a},$$
then the probability distribution $P_Y(y)$ of the stochastic variable $Y = X_1 + X_2$ is also a Gaussian, a Lorentzian, or a Poisson distribution, respectively. What property of the characteristic function $\Phi_X(k)$ guarantees the robustness of $P_X(x)$?

Solution:
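
For the Lorentzian, the robustness is visible in $\Phi_X(k) = e^{-a|k|}$, so $\Phi_Y(k) = \Phi_X(k)^2 = e^{-2a|k|}$ is again Lorentzian, with doubled width. A Monte Carlo sketch (Python/NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(7)
a = 1.0
y = a * (rng.standard_cauchy(1_000_000) + rng.standard_cauchy(1_000_000))

# Prediction: Lorentzian of width 2a.
hist, edges = np.histogram(y, bins=200, range=(-20, 20), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
lorentz = (1/np.pi) * (2*a) / (centers**2 + (2*a)**2)
print(np.max(np.abs(hist - lorentz)))      # small
```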

[nex81] Stable probability distributions

Consider $N$ independent random variables $X_1, \ldots, X_N$, each having the same probability distribution $P_X(x)$. If the probability distribution of the random variable $Y_N = X_1 + \cdots + X_N$ can be written in the form $P_Y(y) = P_X(y/c_N + \gamma_N)/c_N$, then $P_X(x)$ is stable. The multiplicative constant must be of the form $c_N = N^{1/\alpha}$, where $\alpha$ is the index of the stable distribution. $P_X(x)$ is strictly stable if $\gamma_N = 0$.
Use the results of [nex19] to determine the indices $\alpha$ of the Gaussian and Lorentzian distributions, both of which are strictly stable. Show that the Poisson distribution is not stable in the technical sense used here.

Solution:

Exponential Distribution [nln1]

Busses arrive randomly at a bus station. The average interval between successive bus arrivals is $\tau$.

$f(t)\,dt$: probability that the interval is between $t$ and $t + dt$.
$P(t) = \int_t^\infty dt'\, f(t')$: probability that the interval is larger than $t$.
Relation: $f(t) = -dP/dt$.
Normalizations: $P(0) = 1$, $\int_0^\infty dt\, f(t) = 1$.
Mean value: $\langle t \rangle = \int_0^\infty dt\, t f(t) = \tau$.

Start the clock when a bus has arrived and consider the events A and B.
Event A: the next bus has not arrived by time $t$.
Event B: a bus arrives between times $t$ and $t + dt$.
Assumptions:
1. $P(AB) = P(A)P(B)$ (statistical independence).
2. $P(B) = c\,dt$ with $c$ to be determined.
Consequence:
$$P(t + dt) = P(A\bar{B}) = P(A)\,P(\bar{B}) = P(t)\,[1 - c\,dt]$$
$$\Rightarrow\; \frac{d}{dt}\, P(t) = -c\, P(t) \;\Rightarrow\; P(t) = e^{-ct}, \quad f(t) = c\, e^{-ct}.$$
Adjust the mean value: $\langle t \rangle = \tau \Rightarrow c = 1/\tau$.

Exponential distribution: $P(t) = e^{-t/\tau}$, $f(t) = \frac{1}{\tau}\, e^{-t/\tau}$.

Find the probability $P_n(t)$ that $n$ busses arrive before time $t$. First consider the probabilities $f(t')\,dt'$ and $P(t - t')$ of the two statistically independent events that the first bus arrives between $t'$ and $t' + dt'$ and that no further bus arrives until time $t$. Probability that exactly one bus arrives until time $t$:
$$P_1(t) = \int_0^t dt'\, f(t')\, P(t - t') = \frac{t}{\tau}\, e^{-t/\tau}.$$
Then calculate $P_n(t)$ by induction. Poisson distribution:
$$P_n(t) = \int_0^t dt'\, f(t')\, P_{n-1}(t - t') = \frac{(t/\tau)^n}{n!}\, e^{-t/\tau}.$$
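
A simulation sketch (Python/NumPy assumed): drawing exponential inter-arrival times and counting arrivals before time $t$ reproduces the Poisson distribution $P_n(t)$ derived above:

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(8)
tau, t, trials = 1.0, 3.0, 200_000

# 50 inter-arrival gaps per trial is far more than ever needed for t/tau = 3.
gaps = rng.exponential(scale=tau, size=(trials, 50))
counts = (np.cumsum(gaps, axis=1) < t).sum(axis=1)

for n in range(6):
    empirical = np.mean(counts == n)
    poisson = (t/tau)**n / factorial(n) * exp(-t/tau)
    print(n, empirical, poisson)   # the two columns agree
```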

Waiting Time Problem [nln11]

Busses arrive more or less randomly at a bus station. Given is the probability distribution $f(t)$ for intervals between bus arrivals.

Normalization: $\int_0^\infty dt\, f(t) = 1$.
Probability that the interval is larger than $t$: $P_0(t) = \int_t^\infty dt'\, f(t')$.
Mean time interval between arrivals: $\tau_B = \int_0^\infty dt\, t f(t) = \int_0^\infty dt\, P_0(t)$.

Find the probability $Q_0(t)$ that no arrivals occur in a randomly chosen time interval of length $t$. First consider the probability $P_0(t' + t)$ for this to be the case if the interval starts at time $t'$ after the last bus arrival. Then average $P_0(t' + t)$ over the range of elapsed time $t'$:
$$Q_0(t) = c \int_0^\infty dt'\, P_0(t' + t) \quad \text{with normalization} \quad Q_0(0) = 1$$
$$\Rightarrow\; Q_0(t) = \frac{1}{\tau_B} \int_t^\infty dt'\, P_0(t').$$

Passengers come to the station at random times. The probability that a passenger has to wait at least a time $t$ before the next bus is then $Q_0(t)$.

Probability distribution of passenger waiting times:
$$g(t) = -\frac{d}{dt}\, Q_0(t) = \frac{1}{\tau_B}\, P_0(t).$$

Mean passenger waiting time:
$$\tau_P = \int_0^\infty dt\, t g(t) = \int_0^\infty dt\, Q_0(t).$$

The relationship between $\tau_B$ and $\tau_P$ depends on the distribution $f(t)$: in general, $\tau_P \gtrless \tau_B$. The equality sign holds for the exponential distribution.
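
A numerical sketch (Python/NumPy assumed) of the $\tau_P \gtrless \tau_B$ behavior. It uses a consequence of the formulas above that is not spelled out in [nln11]: integrating $Q_0(t)$ by parts gives $\tau_P = \langle t^2 \rangle / (2\tau_B)$:

```python
import numpy as np

rng = np.random.default_rng(9)

def tau_p(gaps):
    # tau_P = <t^2> / (2 tau_B), from integrating Q_0(t) by parts.
    return 0.5 * np.mean(gaps**2) / np.mean(gaps)

n = 1_000_000
for name, gaps in [("exponential", rng.exponential(1.0, n)),    # tau_P = tau_B
                   ("uniform(0,2)", rng.uniform(0.0, 2.0, n)),  # tau_P < tau_B
                   ("heavy-tailed", rng.pareto(4.5, n))]:       # tau_P > tau_B
    print(name, gaps.mean(), tau_p(gaps))
```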

[nex22] Pascal distribution

Consider the quantum harmonic oscillator in thermal equilibrium at temperature $T$. The energy levels (relative to the ground state) are $E_n = n\hbar\omega$, $n = 0, 1, 2, \ldots$
(a) Show that the system is in level $n$ with probability
$$P(n) = (1 - \gamma)\,\gamma^n, \qquad \gamma = \exp(-\hbar\omega/k_B T).$$
$P(n)$ is called the Pascal distribution or geometric distribution.
(b) Calculate the factorial moments $\langle n^m \rangle_f$ and the factorial cumulants $\langle\langle n^m \rangle\rangle_f$ of this distribution.
(c) Show that the Pascal distribution has a larger variance $\langle\langle n^2 \rangle\rangle$ than the Poisson distribution with the same mean value $\langle n \rangle$.

Solution:
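
A small check of part (c) (plain Python): for the Pascal distribution $\langle\langle n^2 \rangle\rangle = \langle n \rangle (1 + \langle n \rangle)$, which exceeds the Poisson variance $\langle n \rangle$ for any $\gamma > 0$:

```python
# Pascal (geometric) distribution: P(n) = (1 - gamma) * gamma**n, n = 0, 1, 2, ...
gamma = 0.6
mean = gamma / (1 - gamma)             # <n>
var_pascal = gamma / (1 - gamma)**2    # <<n^2>> = <n>(1 + <n>)
var_poisson = mean                     # Poisson with the same mean value

print(mean, var_pascal, var_poisson, var_pascal > var_poisson)   # -> True
```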
