Probability and Statistics



Contents: Some Stochastic Processes; Stationary Stochastic Processes

4. Some Stochastic Processes
4.1 Bernoulli process
4.2 Binomial process
4.3 Sine wave process
4.4 Random-telegraph process

4.1 Bernoulli Process
Definition. The infinite sequence of random variables {X_n, n = 1, 2, 3, ...} is called a random sequence. If each X_n is a Bernoulli random variable with
P[X_n = 1] = p, P[X_n = 0] = 1 - p = q,
and the X_n are statistically independent, then {X_n, n = 1, 2, 3, ...} is called a Bernoulli random process.

Probability Distribution
Consider an unending sequence of flips of a coin: flip a coin at each positive-integer value of time (starting at time 1) and observe the result. To determine the joint probability of flipping a coin a finite number of times n, set
X_n = 1 if a head results from the coin flip at time n, X_n = 0 if not.
Since the X_n are statistically independent, the joint probability distribution is the set of probabilities
P[X_1 = x_1, X_2 = x_2, ..., X_n = x_n] = P[X_1 = x_1] P[X_2 = x_2] ... P[X_n = x_n],
where x_i ∈ {0, 1}, i = 1, 2, ..., n.

Example: two-dimensional case. The joint probability distribution is the set of probabilities
P[X_1 = 1, X_2 = 1] = p^2, P[X_1 = 1, X_2 = 0] = pq,
P[X_1 = 0, X_2 = 1] = qp, P[X_1 = 0, X_2 = 0] = q^2.
Exercise: determine the probabilities of occurrence of the three particular Bernoulli sample sequences.

Statistical Averages
E[X_n] = p, var(X_n) = pq = p(1 - p).
This expectation and variance are called the mean value and variance of the Bernoulli process.
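The mean and variance above can be checked against a simulated realization of the process. This is a minimal sketch using only the Python standard library; the value p = 0.6 is an arbitrary choice for illustration.

```python
import random
from statistics import fmean, pvariance

random.seed(0)
p = 0.6          # hypothetical success probability, for illustration only
N = 100_000

# One realization of the Bernoulli process {X_n, n = 1, ..., N}.
x = [1 if random.random() < p else 0 for _ in range(N)]

mean_hat = fmean(x)     # estimates E[X_n] = p
var_hat = pvariance(x)  # estimates var(X_n) = p(1 - p)
print(mean_hat, var_hat)
```

With N this large, both estimates should land within about one percent of p and pq.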

4.2 Binomial Process
Definition. A random process {Y_n, n = 1, 2, 3, ...} in which the counting random variable
Y_n = X_1 + X_2 + ... + X_n
is defined to be a sum of independent Bernoulli random variables.
Example: model an unending sequence of flips of a coin by a zero-one Bernoulli process and count the number of heads in n flips.

Probability Distribution
If the X_i can assume only the values 1 or 0, then
P[Y_n = k] = (n choose k) p^k (1 - p)^(n-k), where (n choose k) = n! / (k!(n - k)!), k = 0, 1, 2, ..., n.
Notice: while each counting random variable Y_n is a sum of independent random variables, the various Y_n are not independent.

Probability Distribution
E[Y_n] = np, var(Y_n) = npq.
For two different counting random variables Y_m and Y_n,
cov(Y_m, Y_n) = pq min(m, n), var(Y_m - Y_n) = pq |m - n|,
where min(m, n) denotes the smaller of the indices m and n.
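The covariance formula cov(Y_m, Y_n) = pq min(m, n) can be checked by Monte Carlo. The indices m = 5, n = 12 and the value p = 0.5 below are arbitrary illustrative choices:

```python
import random
from statistics import fmean

random.seed(1)
p = 0.5
q = 1 - p
m, n = 5, 12            # two time indices with m < n (arbitrary choices)
trials = 100_000

pairs = []
for _ in range(trials):
    x = [1 if random.random() < p else 0 for _ in range(n)]
    pairs.append((sum(x[:m]), sum(x)))   # (Y_m, Y_n) for this realization

mean_m = fmean(ym for ym, _ in pairs)    # should approach m p
mean_n = fmean(yn for _, yn in pairs)    # should approach n p
cov_hat = fmean(ym * yn for ym, yn in pairs) - mean_m * mean_n
print(cov_hat)   # theory: p q min(m, n) = 0.25 * 5 = 1.25
```

Because Y_m is contained in Y_n as a partial sum, the two counts are positively correlated, exactly as the formula predicts.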

4.3 Sine Wave Process
Definition. A random process {X(t), t ∈ T}, where the index set T is continuous and
X(t) = V sin(ωt + Θ) for all values of t in T,
and where V, ω, and Θ are random variables.

Example in practice: an electronics instrument manufacturer produces sine wave oscillators.
The output of a particular oscillator at any time can be characterized by the sample function x(t) = v sin(ωt + θ).
The outputs of various oscillators at a specified time t_1 can be characterized by the random variable X_{t_1} = V sin(ωt_1 + Θ).
The outputs of various oscillators at any time can be characterized by the sine wave process {X_t = V sin(ωt + Θ), t ≥ 0}.
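The three views above (one sample function, one random variable, the whole process) can be sketched in code. The distributions chosen below for V, ω, and Θ are hypothetical, picked only to make the illustration concrete:

```python
import math
import random

random.seed(2)

def draw_oscillator():
    # Hypothetical distributions for V, omega, Theta, for illustration only.
    v = random.uniform(0.5, 1.5)
    omega = random.uniform(0.9, 1.1)
    theta = random.uniform(0.0, 2.0 * math.pi)
    return v, omega, theta

# Fixing (v, omega, theta) gives one sample function x(t) = v sin(omega t + theta);
# evaluating several independent draws on the same time grid shows the process view,
# and one column of `samples` is the random variable X_t at that fixed t.
oscillators = [draw_oscillator() for _ in range(3)]
t_grid = [0.5 * k for k in range(5)]
samples = [[v * math.sin(w * t + th) for t in t_grid]
           for (v, w, th) in oscillators]
print(samples)
```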

4.4 Random-Telegraph Process
This is a real random process whose sample functions at any instant of time t may assume only the values zero or one, and it is assumed that
P[X(t) = 0] = P[X(t) = 1] = 1/2.
The probability P[k, τ] that k traversals from one value to the other occur in a time interval of length τ is given by the Poisson probability distribution
P[k, τ] = (λτ)^k e^(-λτ) / k!, k = 0, 1, 2, ...

The occurrence of k traversals in an interval of length τ is statistically independent of the value assumed by any particular sample function at the start of the given interval.
[Figure: a random-telegraph sample function x(t), switching between the values 0 and 1.]

Exercise 1. For the above random-telegraph process:
a. Show that E[X(t)] = 1/2.
b. Show that R_X(t_1, t_2) = P[X(t_1) = 1, X(t_2) = 1].
c. Show further that R_X(t_1, t_2) = (1/2) P[k even], where P[k even] is the probability that the number of traversals which occur in an interval of duration |t_1 - t_2| is even.
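Part (a) of the exercise can be checked by simulation: start in an equiprobable initial state and flip it at Poisson(λ) event times. The rate λ = 2 and the observation time t = 1.3 below are arbitrary illustrative choices:

```python
import random

random.seed(3)
lam = 2.0        # traversal rate λ (illustrative value)
t_obs = 1.3      # observation time (illustrative value)
trials = 100_000

def telegraph_at(t):
    # Equiprobable initial value; the state flips at each Poisson(λ) event time,
    # generated here as a sum of exponential inter-event times.
    state = random.randint(0, 1)
    clock = random.expovariate(lam)
    while clock <= t:
        state ^= 1
        clock += random.expovariate(lam)
    return state

mean_hat = sum(telegraph_at(t_obs) for _ in range(trials)) / trials
print(mean_hat)   # exercise part (a): E[X(t)] = 1/2 at every t
```

Because the initial value is equiprobable and independent of the traversals, the estimate stays near 1/2 regardless of t and λ.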

5. Strict-Sense Stationarity (S.S.S.)
Definition. If
f(x_1, x_2, ..., x_n; t_1, t_2, ..., t_n) = f(x_1, x_2, ..., x_n; t_1 + c, t_2 + c, ..., t_n + c)
for any c, where the left side represents the joint density function of the random variables X(t_1), X(t_2), ..., X(t_n) and the right side corresponds to the joint density function of the random variables X(t_1 + c), X(t_2 + c), ..., X(t_n + c), then a process X(t) is said to be strict-sense stationary if the above is true for all t_i, i = 1, 2, ..., n, all n = 1, 2, ..., and any c.
Stationary processes exhibit statistical properties that are invariant to shifts in the time index. In strict terms, the statistical properties are governed by the joint probability density function.

Strict-Sense Stationarity (S.S.S.): Probability Distribution Properties
For a first-order strict-sense stationary process, f(x, t) = f(x, t + c) for any c. In particular, c = -t gives
f(x, t) = f(x),
i.e., the first-order density of X(t) is independent of t.
Similarly, for a second-order strict-sense stationary process we have, from the previous page,
f(x_1, x_2; t_1, t_2) = f(x_1, x_2; t_1 + c, t_2 + c)
for any c. For c = -t_2 we get
f(x_1, x_2; t_1, t_2) = f(x_1, x_2; t_1 - t_2).

Strict-Sense Stationarity (S.S.S.): Statistical Average Properties
E[X(t)] = ∫ x f(x) dx = μ, a constant.
The autocorrelation function is given by
R_X(t_1, t_2) = E[X(t_1) X*(t_2)] = ∫∫ x_1 x_2* f(x_1, x_2; t_1 - t_2) dx_1 dx_2 = R_X(t_1 - t_2) = R_X(τ),
i.e., the autocorrelation function of a second-order strict-sense stationary process depends only on the difference of the time indices, τ = t_1 - t_2.

Strict-Sense Stationarity: Exercise 2
Consider the sine wave process {X_t = V cos(ωt), t ≥ 0}. Show whether or not this random process is stationary in the strict sense.

6. Wide-Sense Stationarity (W.S.S.)
Definition. A process X(t) is said to be wide-sense stationary if
(i) E[|X(t)|^2] < ∞,
(ii) E[X(t)] = μ, a constant,
(iii) E[X(t_1) X*(t_2)] = R_X(t_1 - t_2) = R_X(τ).
For wide-sense stationary processes, the mean is a constant and the autocorrelation function depends only on the difference between the time indices. Notice that these conditions do not say anything about the nature of the probability density functions; they deal only with the average behavior of the process.
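A standard example satisfying these conditions is the random-phase cosine X(t) = cos(ωt + Θ) with Θ uniform on [0, 2π), for which E[X(t)] = 0 and R_X(τ) = (1/2) cos(ωτ). A Monte Carlo sketch (the frequency ω = 1.7 is an arbitrary choice) shows the correlation depends only on the lag, not the time origin:

```python
import math
import random

random.seed(4)
omega = 1.7       # illustrative frequency
trials = 100_000

def corr(t1, t2):
    # Monte Carlo estimate of E[X(t1) X(t2)] for X(t) = cos(omega t + Theta),
    # with Theta uniform on [0, 2*pi).
    acc = 0.0
    for _ in range(trials):
        theta = random.uniform(0.0, 2.0 * math.pi)
        acc += math.cos(omega * t1 + theta) * math.cos(omega * t2 + theta)
    return acc / trials

tau = 0.8
r_a = corr(0.0, 0.0 + tau)            # lag tau, origin 0
r_b = corr(2.0, 2.0 + tau)            # same lag, shifted origin
r_theory = 0.5 * math.cos(omega * tau)
print(r_a, r_b, r_theory)
```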

Relationship between S.S.S. and W.S.S.
By the definitions above, strict-sense stationarity implies wide-sense stationarity whenever the process has finite second moments (finite power). The converse is not true in general; the notable exception is the Gaussian process, for which wide-sense stationarity implies strict-sense stationarity.

Properties of a Stationary Stochastic Process: the Autocorrelation Function
If the real random process {X(t), t ∈ ℝ} is stationary, then
R_X(-τ) = R_X(τ),
|R_X(τ)| ≤ R_X(0),
R_X(0) = E[X(t)^2] = σ^2 + m^2.
In practice, if the process contains no periodic component, then
lim_{|τ|→∞} R_X(τ) = m^2.
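These properties can be observed numerically on a process with a known autocorrelation. The moving-average construction below, X_n = W_n + W_{n-1} with W_n white Gaussian noise, is an illustrative choice with m = 0, R(0) = 2, R(±1) = 1, and R(k) = 0 for |k| > 1:

```python
import random
from statistics import fmean

random.seed(5)
N = 100_000

# MA(1) construction: X_n = W_n + W_{n-1}, W_n white Gaussian noise.
w = [random.gauss(0.0, 1.0) for _ in range(N + 1)]
x = [w[i] + w[i + 1] for i in range(N)]

def r_hat(k):
    # Time-average autocorrelation estimate from one long realization;
    # for a real process R(-k) = R(k), so only |k| matters.
    k = abs(k)
    return fmean(x[i] * x[i + k] for i in range(N - k))

print(r_hat(0), r_hat(1), r_hat(10))
```

The estimate peaks at lag 0, is even in the lag, and decays to m^2 = 0 beyond lag 1, matching the listed properties.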

7. Joint Stationarity
1) Definition. A pair of real random processes {X(t), t ∈ ℝ} and {Y(t), t ∈ ℝ} are jointly stationary in the wide sense when
E[X(t)] = m_X, E[Y(t)] = m_Y,
R_X(t, t + τ) = R_X(τ), R_Y(t, t + τ) = R_Y(τ),
R_XY(t, t + τ) = R_XY(τ), R_YX(t, t + τ) = R_YX(τ),
for all values of t.

Exercise 3
Let the two random processes {U(t), t ∈ ℝ} and {V(t), t ∈ ℝ} be such that
U(t) = X cos t + Y sin t, V(t) = Y cos t + X sin t
for all t, where X and Y are independent real random variables for which
E[X] = E[Y] = 0 and E[X^2] = E[Y^2] = 1.
a. Show that the two processes are individually stationary in the wide sense.
b. Show that they are not jointly stationary in the wide sense.
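Part (b) can be illustrated by Monte Carlo. Taking U(t) = X cos t + Y sin t and V(t) = Y cos t + X sin t with independent zero-mean, unit-variance X and Y, a direct expansion gives E[U(t_1)V(t_2)] = sin(t_1 + t_2), which depends on the time origin and not only on the lag. The Gaussian draws below are just one convenient choice of such X and Y:

```python
import math
import random

random.seed(6)
trials = 100_000

def r_uv(t1, t2):
    # Monte Carlo estimate of E[U(t1) V(t2)] with U(t) = X cos t + Y sin t and
    # V(t) = Y cos t + X sin t; Gaussian X, Y are one convenient choice of
    # independent zero-mean, unit-variance random variables.
    acc = 0.0
    for _ in range(trials):
        x = random.gauss(0.0, 1.0)
        y = random.gauss(0.0, 1.0)
        u = x * math.cos(t1) + y * math.sin(t1)
        v = y * math.cos(t2) + x * math.sin(t2)
        acc += u * v
    return acc / trials

# Same lag t2 - t1 = 0.5 at two different time origins.
a = r_uv(0.0, 0.5)
b = r_uv(1.0, 1.5)
theory_a = math.sin(0.5)   # E[U(t1)V(t2)] = sin(t1 + t2)
theory_b = math.sin(2.5)
print(a, b)
```

The two estimates differ even though the lag is the same, so the cross-correlation is not a function of τ alone: the processes are not jointly wide-sense stationary.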

2) Properties of the Cross-Correlation Function
Suppose the two real random processes {X(t), t ∈ ℝ} and {Y(t), t ∈ ℝ} are individually stationary in the wide sense and are jointly stationary in the wide sense. Then
R_XY(-τ) = R_YX(τ),
|R_XY(τ)|^2 ≤ R_X(0) R_Y(0).
If the two processes are jointly stationary, then Z(t) = X(t) + Y(t) is stationary, with
E[Z(t)^2] = E[X(t)^2] + 2 E[X(t)Y(t)] + E[Y(t)^2].

3) Cross-Covariance Function
Suppose the two random processes {X(t), t ∈ ℝ} and {Y(t), t ∈ ℝ} are individually stationary in the wide sense and are jointly stationary in the wide sense. Then
C_XY(-τ) = C_YX(τ),
|C_XY(τ)|^2 ≤ σ_X^2 σ_Y^2.

Homework: 10.6, 10.8, 10.9, 10.10, 10.13, 10.20