ECE353: Probability and Random Processes. Lecture 18 - Stochastic Processes
1 ECE353: Probability and Random Processes. Lecture 18 - Stochastic Processes. Xiao Fu, School of Electrical Engineering and Computer Science, Oregon State University. xiao.fu@oregonstate.edu
2 From RV to Stochastic Process. Recall that a RV X is a mapping from the sample space to a real number (i.e., X(s)). ECE353 Probability and Random Processes, X. Fu, School of EECS, Oregon State University.
3 From RV to Stochastic Process. A random pair is a mapping to two random variables.
4 From RV to Stochastic Process. A random vector is a mapping to a sequence of random variables.
5 From RV to Stochastic Process. A stochastic process is a mapping X(t, s) that maps an outcome to an infinite-length sequence that is indexed by time.
6 Sample Path. [Figure: one realization of X(t, s) for a fixed outcome s.]
7 Sample Path. Fixing time t = t_1, X(t_1, s) is a single RV.
8 Examples. Example 1: pick a video on YouTube at random to play; every video is a unique stream of bits. Example 2: random sinusoid X(t, s) = A(s) sin(ω(s)t + φ(s)), used for modulation in communications.
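The random sinusoid above can be simulated directly: each draw of the outcome s fixes (A, ω, φ) and hence one deterministic sample path; fixing t instead gives a single RV. A minimal sketch (the specific amplitude, frequency, and phase distributions below are illustrative assumptions, not from the slides):

```python
import math
import random

random.seed(0)

def random_sinusoid():
    # One outcome s fixes (A, w, phi); the result is a deterministic
    # function of t, i.e. one sample path. Distributions are illustrative.
    A = random.gauss(0.0, 1.0)
    w = random.uniform(1.0, 5.0)
    phi = random.uniform(0.0, 2.0 * math.pi)
    return lambda t: A * math.sin(w * t + phi)

# Three outcomes -> three different sample paths.
paths = [random_sinusoid() for _ in range(3)]

# Fixing t = t1 turns the process into a single random variable X(t1, s):
t1 = 0.7
samples_at_t1 = [x(t1) for x in paths]
```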
9 Types of Stochastic Processes. Continuous-time process: t is continuous; written X(t, s). Discrete-time process: t is not continuous (e.g., digital signal processing); written X_n(s). Discrete-valued process: X(t, s) is a discrete RV. Continuous-valued process: X(t, s) is a continuous RV. Q: what is the type of the process shown in the figure?
10 Types of Stochastic Processes. (Same definitions as the previous slide.) Q: what about the processes shown in these figures?
11 Poisson Processes of Rate λ. Motivation: we wish to model the number of data packets arriving at a data center over time, or the number of customers arriving at a mall over time. Definition: a Poisson process of rate λ, denoted N(t, s) (notation abused to N(t), since we know that s is always playing a role), satisfies: 1. N(t) = 0 for t < 0; 2. for all t_1 > t_0, the increment N(t_1) − N(t_0) is a Poisson RV with mean λ(t_1 − t_0); 3. if [t_0, t_1] and [t_0', t_1'] are non-overlapping, then the corresponding increments N(t_1) − N(t_0) and N(t_1') − N(t_0') are independent RVs. Note: a Poisson RV with mean α > 0 has PMF P_N(n) = α^n e^{−α}/n! for n = 0, 1, 2, ..., and 0 otherwise.
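The definition can be exercised numerically: build N(t) at a few time points by drawing independent Poisson increments, one per interval. A hedged sketch; `poisson_sample` uses Knuth's classic method, and the rate and time points are illustrative:

```python
import math
import random

random.seed(1)

def poisson_sample(mean):
    # Knuth's method: count uniform draws until their running product
    # drops below e^{-mean}; adequate for the modest means used here.
    threshold = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p < threshold:
            return k
        k += 1

def poisson_path(lam, times):
    # N(t) at increasing time points, built from independent increments,
    # each Poisson with mean lam * (length of the interval).
    n, prev, path = 0, 0.0, []
    for t in times:
        n += poisson_sample(lam * (t - prev))
        path.append(n)
        prev = t
    return path

lam = 2.0
paths = [poisson_path(lam, [1.0, 2.0, 5.0]) for _ in range(20000)]
mean_N5 = sum(p[2] for p in paths) / len(paths)   # E[N(5)] = lam * 5 = 10
```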
12 Poisson Processes of Rate λ. [Illustration figure.]
13 Poisson Processes of Rate λ. Example: let us assume t_1 ≤ t_2 ≤ t_3 and n_1 ≤ n_2 ≤ n_3. What is the joint PMF P_{N(t_1),N(t_2),N(t_3)}(n_1, n_2, n_3)?
14 Poisson Processes of Rate λ. We are interested in the joint probability
P[N(t_1) = n_1, N(t_2) = n_2, N(t_3) = n_3]
= P[N(t_1) − N(0) = n_1, N(t_2) − N(t_1) = n_2 − n_1, N(t_3) − N(t_2) = n_3 − n_2]
= P[N(t_1) − N(0) = n_1] P[N(t_2) − N(t_1) = n_2 − n_1] P[N(t_3) − N(t_2) = n_3 − n_2]
= ((λt_1)^{n_1}/n_1!) e^{−λt_1} · ((λ(t_2 − t_1))^{n_2 − n_1}/(n_2 − n_1)!) e^{−λ(t_2 − t_1)} · ((λ(t_3 − t_2))^{n_3 − n_2}/(n_3 − n_2)!) e^{−λ(t_3 − t_2)}
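The factorization above can be checked mechanically: summing the closed-form joint PMF over n_1 and n_2 must recover the Poisson marginal of N(t_3). A small sketch with illustrative parameter values:

```python
import math

def pois_pmf(n, a):
    # PMF of a Poisson RV with mean a
    return a ** n * math.exp(-a) / math.factorial(n)

def joint_pmf(lam, t1, t2, t3, n1, n2, n3):
    # P[N(t1)=n1, N(t2)=n2, N(t3)=n3], via independent Poisson increments
    if not (0 <= n1 <= n2 <= n3):
        return 0.0
    return (pois_pmf(n1, lam * t1)
            * pois_pmf(n2 - n1, lam * (t2 - t1))
            * pois_pmf(n3 - n2, lam * (t3 - t2)))

# Sanity check: marginalizing over n1 and n2 must recover
# P[N(t3) = n3], a Poisson PMF with mean lam * t3.
lam, t1, t2, t3, n3 = 1.0, 1.0, 2.0, 3.0, 4
marginal = sum(joint_pmf(lam, t1, t2, t3, n1, n2, n3)
               for n1 in range(n3 + 1) for n2 in range(n3 + 1))
```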
15 Arrival Time. From the Poisson process, one can also characterize the arrival times. Let N(t) denote the number of customers that one observes by time t, which is a Poisson process. The time at which the first customer arrives is a random variable, and the inter-arrival times are also random.
16 Arrival Time. Let us consider the arrival time of the first customer, X_1. What is its PDF? We start with the CDF and P[X_1 ≤ x_1]. This is not easy to compute directly, but we may compute P[X_1 > x_1] = P[no arrival until time point x_1] (note that the number of arrivals between t = 0 and t = x_1 is a Poisson RV with mean λ(x_1 − 0)):
P[X_1 > x_1] = P[N(x_1) − N(0) = 0] = ((λx_1)^0/0!) e^{−λx_1} = e^{−λx_1}.
Hence, F_{X_1}(x_1) = P[X_1 ≤ x_1] = 1 − e^{−λx_1}. The PDF is f_{X_1}(x_1) = dF_{X_1}(x_1)/dx_1 = λe^{−λx_1} for x_1 ≥ 0. Beautiful!
f_{X_1}(x_1) = λe^{−λx_1} for x_1 ≥ 0, and 0 otherwise.
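One way to see this result numerically (an approximation, not the slide's derivation): replace the continuous-time process by independent Bernoulli(λ·dt) trials on a fine time grid, so the first success time is approximately Exp(λ). All parameter values below are illustrative:

```python
import math
import random

random.seed(3)

lam, dt, trials = 2.0, 5e-3, 20000

def first_arrival_time():
    # Approximate a rate-lam Poisson process by independent Bernoulli(lam*dt)
    # trials on a fine time grid; return the time of the first success.
    t = 0.0
    while True:
        t += dt
        if random.random() < lam * dt:
            return t

x = 0.5
survival = sum(first_arrival_time() > x for _ in range(trials)) / trials
# Theory: P[X1 > x] = exp(-lam * x) = exp(-1), up to discretization error
```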
17 Inter-arrival Time. What is the PDF of X_2, the first inter-arrival time? What we know is that {X_1 = x_1} has already happened, i.e., N(x_1) = 1.
P[X_2 > x_2 | X_1 = x_1] = P[N(x_1 + x_2) − N(x_1) = 0 | N(x_1) = 1]
= P[N(x_1 + x_2) − N(x_1) = 0 | N(x_1) − N(0) = 1]
= P[N(x_1 + x_2) − N(x_1) = 0] (by independent increments)
= e^{−λx_2}.
The above has nothing to do with x_1, so X_2 and X_1 are independent, and
F_{X_2}(x_2) = 1 − e^{−λx_2} for x_2 > 0, and 0 otherwise.
X_2 is also an exponentially distributed RV! For a Poisson process N(t) of rate λ, the inter-arrival times {X_i}_{i=1}^∞ are i.i.d. exponential RVs.
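Conversely, the i.i.d.-exponential characterization gives a standard way to generate a Poisson process: accumulate Exp(λ) gaps and count arrivals. A sketch with illustrative parameters; the count in [0, t_end] should then be Poisson(λ·t_end):

```python
import random

random.seed(4)

lam, t_end, trials = 3.0, 2.0, 20000

def count_arrivals():
    # Arrival times are cumulative sums of i.i.d. Exp(lam) gaps;
    # return how many arrivals land in [0, t_end].
    t, n = 0.0, 0
    while True:
        t += random.expovariate(lam)
        if t > t_end:
            return n
        n += 1

counts = [count_arrivals() for _ in range(trials)]
mean = sum(counts) / trials
var = sum((c - mean) ** 2 for c in counts) / trials
# For a Poisson(lam * t_end) = Poisson(6) count: mean = var = 6
```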
18 Brownian Motion Process. Definition: the continuous-time Brownian motion is a process W(t) such that W(0) = 0 and W(t + τ) − W(t) ~ N(0, σ² = ατ), i.e., W(t + τ) − W(t) is a zero-mean Gaussian RV with variance ατ.
Discrete-time Brownian motion: X_{n+1} = X_n + W_{n+1}, X_0 = 0, where W_n ~ N(0, σ²) and {W_n}_{n=1}^∞ are i.i.d. Then
X_1 = X_0 + W_1, X_2 = X_1 + W_2 = W_1 + W_2, X_3 = X_2 + W_3 = W_1 + W_2 + W_3, ...
so X_n = Σ_{i=1}^n W_i for n ≥ 1, and X_n = 0 for n ≤ 0.
19 Brownian Motion Process. E[X_n] = E[Σ_{i=1}^n W_i] = Σ_{i=1}^n E[W_i] = 0. Var[X_n] = Var[Σ_{i=1}^n W_i] = nσ² (the variance grows unbounded as n → ∞). Let Z_n = (1/n)X_n = (1/n)Σ_{i=1}^n W_i. The factor 1/n matters so much! E[Z_n] = 0, Var[Z_n] = (1/n)² Var[X_n] = σ²/n.
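These two variance formulas are easy to check by simulation. A sketch of the discrete-time Brownian motion with illustrative parameters (σ = 1, n = 100):

```python
import random

random.seed(5)

sigma, n, trials = 1.0, 100, 20000

X_n = []
for _ in range(trials):
    x = 0.0
    for _ in range(n):
        x += random.gauss(0.0, sigma)   # X_{k+1} = X_k + W_{k+1}
    X_n.append(x)

mean = sum(X_n) / trials
var_X = sum((v - mean) ** 2 for v in X_n) / trials   # theory: n*sigma^2 = 100
var_Z = var_X / n ** 2                               # Z_n = X_n/n, theory: sigma^2/n = 0.01
```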
20 Basic Statistics of Stochastic Process. Definition: the expected value function of a stochastic process X(t) is defined as µ_X(t) = E[X(t)]. Note: µ_X(t) is a deterministic function that gives the mean of X(t) for all t. Discrete-time: µ_X[n] = E[X_n] for all n ∈ Z.
21 Basic Statistics of Stochastic Process. Example: random amplitude cosine process: X(t) = A cos(ωt + φ), where the amplitude A = A(s) is random.
22 Basic Statistics of Stochastic Process. We can compute µ_X(t) = E[X(t)] = E[A cos(ωt + φ)] = E[A] cos(ωt + φ). E.g., if A ~ N(0, 1), then µ_X(t) = 0 for all t.
23 Basic Statistics of Stochastic Process. Definition: the auto-covariance of a random process is C_X(t, τ) = Cov[X(t), X(t + τ)]; discrete-time: C_X[m, k] = Cov[X_m, X_{m+k}]. Definition: the auto-correlation of a random process is R_X(t, τ) = E[X(t)X(t + τ)]; discrete-time: R_X[m, k] = E[X_m X_{m+k}]. The two are related by C_X(t, τ) = R_X(t, τ) − µ_X(t)µ_X(t + τ) and C_X[m, k] = R_X[m, k] − µ_X[m]µ_X[m + k].
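For a concrete feel, the discrete-time definitions can be estimated from data. The sketch below uses a time average over one long i.i.d. Gaussian sequence (justified here because that process is stationary, so R_W[m, k] does not depend on m); σ and the sequence length are illustrative:

```python
import random

random.seed(6)

sigma, N = 2.0, 200000
W = [random.gauss(0.0, sigma) for _ in range(N)]

def R_hat(k):
    # Sample estimate of R_W[m, k] = E[W_m W_{m+k}], averaged over m.
    return sum(W[m] * W[m + k] for m in range(N - k)) / (N - k)

# For zero-mean i.i.d. noise: R_W[0] = sigma^2 and R_W[k] = 0 for k != 0;
# since mu_W = 0, the auto-covariance C_W[k] equals R_W[k].
r0, r1 = R_hat(0), R_hat(1)
```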
24 Stationary Process. Let us look at a particular time t_1: X(t_1) is a RV. The PDF f_{X(t_1)}(x) is, generally speaking, a function of t_1. Definition: X(t) is stationary if and only if the joint PDF satisfies f_{X(t_1),...,X(t_m)}(x_1, ..., x_m) = f_{X(t_1+τ),...,X(t_m+τ)}(x_1, ..., x_m) for all τ and m. Hence, if X(t) is stationary, f_{X(t)}(x) is the same for all t.
25 Stationary Process. Example: {W_n}_{n=−∞}^∞ i.i.d. Gaussian (WGN). Is it stationary? How to check? f_{W_1}(w_1) = f_{W_{1+q}}(w_1)? f_{W_1,W_2}(w_1, w_2) = f_{W_{1+q},W_{2+q}}(w_1, w_2)? i.i.d. ⇒ stationary (the converse is not true). Example: X_n(s) = A(s). Given s, A is fixed, so every sample of the path takes the same value (the joint distribution of any (X_{n_1}, X_{n_2}) is determined by A alone); the process is always stationary, but its samples are not independent. Example: discrete-time Brownian motion X_n: Var[X_n] = nσ², so Var[X_1] = σ² while Var[X_100] = 100σ²; the marginals cannot have the same PDFs, so it is not stationary.
26 Stationary Process. Theorem: if X(t) is a stationary process, then µ_X(t) = µ_X for all t, and R_X(t, τ) = E[X(t)X(t + τ)] = R_X(0, τ) = R_X(τ). These are necessary conditions for being stationary. Proof: µ_X(t) = E[X(t)] = ∫_{−∞}^{∞} x f_{X(t)}(x) dx = ∫_{−∞}^{∞} x f_{X(0)}(x) dx = µ_X for all t, where we have used stationarity: f_{X(t)}(x) = f_{X(0)}(x).
27 Stationary Process. For the auto-correlation part:
R_X(t, τ) = E[X(t)X(t + τ)]
= ∫_{−∞}^{∞} ∫_{−∞}^{∞} x_1 x_2 f_{X(t),X(t+τ)}(x_1, x_2) dx_1 dx_2
= ∫_{−∞}^{∞} ∫_{−∞}^{∞} x_1 x_2 f_{X(0),X(τ)}(x_1, x_2) dx_1 dx_2
= R_X(0, τ).
28 Stationary Process. The necessary conditions are used for disqualifying X(t) as a stationary process. Example: Y(t) = A cos(2πf_c t + θ), where A ~ N(0, 1) is random (and θ is a constant). Is Y(t) stationary? Sanity check: E[Y(t)] = E[A] cos(2πf_c t + θ) = 0. Now set 2πf_c t + θ = π/2 + 2kπ for k ∈ Z: there exist time points t* with 2πf_c t* + θ = π/2 + 2kπ where Y(t*, s) = 0 for all s. Can this be stationary? Check R_Y(t*, τ) = E[Y(t*)Y(t* + τ)] = ?
29 Wide Sense Stationary (WSS) Process. Definition: X(t) is WSS if and only if E[X(t)] = µ_X for all t ∈ R, and R_X(t, τ) = E[X(t)X(t + τ)] = R_X(0, τ) for all t, τ. Example: Y(t) = A cos(2πf_c t + θ), where θ ~ U[0, 2π] is random. Is Y(t) WSS? Let α(t) = 2πf_c t. Then
E[Y(t)] = A E[cos(α(t) + θ)] = A ∫_{θ=0}^{2π} cos(α(t) + θ) (1/2π) dθ = (A/2π) ∫_{θ=0}^{2π} cos(α(t) + θ) dθ = 0.
30 Wide Sense Stationary (WSS) Process. In addition, we have R_Y(t, τ) = E[Y(t)Y(t + τ)] = A² E[cos(2πf_c t + θ) cos(2πf_c(t + τ) + θ)]. Recall that cos A cos B = (1/2)[cos(A − B) + cos(A + B)]. Hence, we have
R_Y(t, τ) = (A²/2)(1/2π) ∫_{θ=0}^{2π} cos(4πf_c t + 2πf_c τ + 2θ) dθ + (A²/2)(1/2π) ∫_{θ=0}^{2π} cos(−2πf_c τ) dθ
= 0 + (A²/2) cos(2πf_c τ) = R_Y(0, τ).
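The conclusion can be verified by Monte Carlo: the estimated autocorrelation of the random-phase sinusoid should match (A²/2) cos(2πf_c τ) regardless of which t we pick. A sketch with illustrative values of A, f_c, and τ:

```python
import math
import random

random.seed(7)

A, fc, trials = 2.0, 1.5, 200000

def R_hat(t, tau):
    # Monte Carlo estimate of E[Y(t) Y(t+tau)] for
    # Y(t) = A cos(2 pi fc t + theta), theta ~ Uniform[0, 2 pi].
    acc = 0.0
    for _ in range(trials):
        th = random.uniform(0.0, 2.0 * math.pi)
        acc += (A * math.cos(2.0 * math.pi * fc * t + th)
                * A * math.cos(2.0 * math.pi * fc * (t + tau) + th))
    return acc / trials

tau = 0.2
theory = (A ** 2 / 2.0) * math.cos(2.0 * math.pi * fc * tau)  # no t dependence
r_a = R_hat(0.0, tau)   # autocorrelation estimated at t = 0
r_b = R_hat(3.7, tau)   # and at a different t; should agree
```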
More informationReview of Probability. CS1538: Introduction to Simulations
Review of Probability CS1538: Introduction to Simulations Probability and Statistics in Simulation Why do we need probability and statistics in simulation? Needed to validate the simulation model Needed
More informationLimiting Distributions
Limiting Distributions We introduce the mode of convergence for a sequence of random variables, and discuss the convergence in probability and in distribution. The concept of convergence leads us to the
More informationECE302 Spring 2006 Practice Final Exam Solution May 4, Name: Score: /100
ECE302 Spring 2006 Practice Final Exam Solution May 4, 2006 1 Name: Score: /100 You must show ALL of your work for full credit. This exam is open-book. Calculators may NOT be used. 1. As a function of
More informationECSE B Solutions to Assignment 8 Fall 2008
ECSE 34-35B Solutions to Assignment 8 Fall 28 Problem 8.1 A manufacturing system is governed by a Poisson counting process {N t ; t < } with rate parameter λ >. If a counting event occurs at an instant
More informationFINAL EXAM: Monday 8-10am
ECE 30: Probabilistic Methods in Electrical and Computer Engineering Fall 016 Instructor: Prof. A. R. Reibman FINAL EXAM: Monday 8-10am Fall 016, TTh 3-4:15pm (December 1, 016) This is a closed book exam.
More informationEE 3025 S2010 Demo 10 Apr 19-20, Reading Assignment: Read Sections 9.5 and of the EE 3025 Matlab Notes.
EE 3025 S2010 Demo 10 Apr 19-20, 2010 Reading Assignment: Read Sections 9.5 and 10.1-10.5 of the EE 3025 Matlab Notes. Part I(25 min): Matlab Part II(25 min): Worked Problems on Chap 10 1 Matlab 1.1 Estimating
More informationMath Spring Practice for the final Exam.
Math 4 - Spring 8 - Practice for the final Exam.. Let X, Y, Z be three independnet random variables uniformly distributed on [, ]. Let W := X + Y. Compute P(W t) for t. Honors: Compute the CDF function
More informationIEOR 3106: Second Midterm Exam, Chapters 5-6, November 7, 2013
IEOR 316: Second Midterm Exam, Chapters 5-6, November 7, 13 SOLUTIONS Honor Code: Students are expected to behave honorably, following the accepted code of academic honesty. You may keep the exam itself.
More informationIntroduction to Probability and Stochastic Processes I
Introduction to Probability and Stochastic Processes I Lecture 3 Henrik Vie Christensen vie@control.auc.dk Department of Control Engineering Institute of Electronic Systems Aalborg University Denmark Slides
More information2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1).
Name M362K Final Exam Instructions: Show all of your work. You do not have to simplify your answers. No calculators allowed. There is a table of formulae on the last page. 1. Suppose X 1,..., X 1 are independent
More informationFinal. Fall 2016 (Dec 16, 2016) Please copy and write the following statement:
ECE 30: Probabilistic Methods in Electrical and Computer Engineering Fall 06 Instructor: Prof. Stanley H. Chan Final Fall 06 (Dec 6, 06) Name: PUID: Please copy and write the following statement: I certify
More informationChapter 5 Random Variables and Processes
Chapter 5 Random Variables and Processes Wireless Information Transmission System Lab. Institute of Communications Engineering National Sun Yat-sen University Table of Contents 5.1 Introduction 5. Probability
More informationReview of Mathematical Concepts. Hongwei Zhang
Review of Mathematical Concepts Hongwei Zhang http://www.cs.wayne.edu/~hzhang Outline Limits of real number sequences A fixed-point theorem Probability and random processes Probability model Random variable
More informationECE 302 Division 1 MWF 10:30-11:20 (Prof. Pollak) Final Exam Solutions, 5/3/2004. Please read the instructions carefully before proceeding.
NAME: ECE 302 Division MWF 0:30-:20 (Prof. Pollak) Final Exam Solutions, 5/3/2004. Please read the instructions carefully before proceeding. If you are not in Prof. Pollak s section, you may not take this
More informationStat 426 : Homework 1.
Stat 426 : Homework 1. Moulinath Banerjee University of Michigan Announcement: The homework carries 120 points and contributes 10 points to the total grade. (1) A geometric random variable W takes values
More informationEE4601 Communication Systems
EE4601 Communication Systems Week 4 Ergodic Random Processes, Power Spectrum Linear Systems 0 c 2011, Georgia Institute of Technology (lect4 1) Ergodic Random Processes An ergodic random process is one
More information7 The Waveform Channel
7 The Waveform Channel The waveform transmitted by the digital demodulator will be corrupted by the channel before it reaches the digital demodulator in the receiver. One important part of the channel
More information1 Expectation of a continuously distributed random variable
OCTOBER 3, 204 LECTURE 9 EXPECTATION OF A CONTINUOUSLY DISTRIBUTED RANDOM VARIABLE, DISTRIBUTION FUNCTION AND CHANGE-OF-VARIABLE TECHNIQUES Expectation of a continuously distributed random variable Recall
More information