UNIT-1: RANDOM PROCESS: Random variables: Several random variables. Statistical averages: Functions of random variables, moments, mean, correlation and covariance functions. Principles of autocorrelation function, cross-correlation functions. Central limit theorem, properties of Gaussian process.

Introduction: Transmission of information or a message from one point to another is called communication. The two points that want to interact are called the transmitter and receiver, and can also be referred to as the source and destination. A channel or medium links any two points that want to communicate. The channel can be wired (guided) or wireless (unguided). The information or message is represented in the form of a signal on the channel, depending on the nature of the channel. For example, if the medium is a twisted pair or coaxial cable the signal is electrical; if the medium is optical the signal is light.

The signal can be analog or digital. If the signal is analog it is continuous, and if it is digital it uses a discrete representation. Analog signals are represented using modulation schemes such as amplitude (AM), frequency (FM) and phase (PM) modulation; digital signals use shift-keying techniques such as amplitude-shift keying (ASK), frequency-shift keying (FSK) and phase-shift keying (PSK). These signals are derived from sources that can likewise be classified as analog sources and digital sources. Examples of analog sources are microphones and TV cameras; computer data is the best example of a digital source. An AM radio system transmits electromagnetic waves with frequencies of around a few hundred kHz (MF band). An FM radio system operates with frequencies in the range 88-108 MHz (VHF band).

The information transfer can happen to a single point or to multiple points. If the signal transfer happens on a single link to only one receiving system, the communication mode is called unicast communication, e.g. telephone.
If the signal transfer happens on multiple links to several or all receivers on the network, the communication mode is called multicast or broadcast communication, e.g. radio, television. The basic communication system model is shown in Figure 1. Poornima.G, Associate Professor, BMSCE, B'lore
The major communication resources of concern are transmitted power, channel bandwidth and noise. The average power of the transmitted signal is referred to as the transmitted power. The channel bandwidth is the band of frequencies allotted for transmission, and noise is any unwanted signal that disturbs transmission of the message signal. Hence designing a communication system that makes efficient use of these resources is important. Channels can accordingly be classified as power-limited and band-limited: telephone channels are band-limited, whereas satellite or radio channels are power-limited.

Information Sources: The communication environment is dominated by information sources such as speech, music, pictures and computer data. A source of information can be characterized in terms of the signals that carry the information. A signal is a single-valued function of time, with time as the independent variable. The signal is one-dimensional for speech, music and computer data; for pictures it is two-dimensional; for video it is three-dimensional; and for volume data over time it is four-dimensional. An analog source produces a continuous electrical signal with respect to time. An analog information source can be transformed into a discrete source through the processes of sampling and quantization. A discrete information source can be characterized by its symbol rate, source alphabet, and source alphabet probabilities.
Communication Networks: A communication network consists of a number of nodes or stations (processors) that perform the function of forwarding data from one node/station to another. The process of forwarding the data/message packets is called switching. There are three switching methods for data communication: circuit switching, message switching and packet switching. In circuit switching a dedicated path has to be provided; the link is fixed and the bandwidth is reserved. Packet switching uses store-and-forward; the path and bandwidth are allocated on demand.

Probability Theory: Statistics is a branch of mathematics that deals with the collection of data and with what can be learned from data. Probability theory is an extension of statistical theory. Probability deals with the result of an experiment whose actual outcome is not known; it also deals with averages of mass phenomena. An experiment in which the outcome cannot be predicted with certainty is called a random experiment. Such experiments can also be referred to as probabilistic experiments, in which more than one thing can happen. E.g. tossing a coin, throwing a die.

A physical phenomenon can be described by a deterministic model or by a stochastic (random) mathematical model. In a deterministic model there is no uncertainty about the time-dependent behavior. A sample point corresponds to the aggregate of all possible outcomes; a sample space or ensemble composed of functions of time is a random process or stochastic process.
x1(t) is an outcome of experiment 1, x2(t) is the outcome of experiment 2, ..., xn(t) is the outcome of experiment n. Each sample point in S is associated with a sample function x(t). X(t; s) is a random process: an ensemble of all time functions together with a probability rule. X(t; sj) is a realization or sample function of the random process. The probability rule assigns a probability to any meaningful event associated with an observation, and an observation is a sample function of the random process.

{x1(tk), x2(tk), ..., xn(tk)} = {X(tk; s1), X(tk; s2), ..., X(tk; sn)}

X(tk; sj) constitutes a random variable: the outcome of an experiment mapped to a real number. Consider an oscillator with nominal frequency ω0 and a tolerance of 1%. The oscillator can take values between ω0(1 ± 0.01), i.e. each realization can take any value between 0.99·ω0 and 1.01·ω0. The frequency of the oscillator can thus be characterized by a random variable.

Statistical averages are important in the measurement of quantities that are obscured by random variations. As an example, consider the problem of measuring a voltage level with a noisy instrument. Suppose that the unknown voltage has value a and that the instrument introduces an uncertainty ε, so the observed value is y = a + ε. Suppose that n independent measurements are made under identical conditions, meaning that neither the unknown value of the voltage nor the statistics of the instrument noise change during the process. Call the n measurements y_i, 1 ≤ i ≤ n. Under our model of the process, it must be the case that y_i = a + ε_i. Now form the quantity

ỹ(n) = (1/n) Σ_{i=1}^{n} y_i

This is the empirical average of the observed values. It is important to note that ỹ(n) is a random variable, because it is a numerical value that is the outcome of a random experiment. That means it will not have a single certain value; we expect to obtain a different value if we repeat the
experiment and obtain n new measurements. We also expect that the result depends upon the value of n, and have the sense that larger values of n should give better results.

Types of Random Variable (RV):
1. Discrete Random Variable: an RV that can take on only a finite or countably infinite set of outcomes.
2. Continuous Random Variable: an RV that can take on any value along a continuum (but may be reported discretely).
Random variables are denoted by upper-case letters (Y); individual outcomes of an RV are denoted by lower-case letters (y).

Discrete Random Variable: discrete random variables are the ones that take on a countable number of values; this means you can sit down and list all possible outcomes without missing any, although it might take you an infinite amount of time. X = values on the roll of two dice: X has to be 2, 3, 4, ..., or 12. Y = number of accidents on the UTA campus during a week: Y has to be 0, 1, 2, 3, 4, 5, 6, 7, 8, ..., up to some very large number.

Probability Distribution: a table, graph, or formula that describes the values a random variable can take on and the corresponding probability (discrete RV) or density (continuous RV).
1. Discrete probability distribution: assigns probabilities (masses) to the individual outcomes.
2. Continuous probability distribution: assigns density at individual points; the probability of a range can be obtained by integrating the density function.
3. Discrete probabilities are denoted by p(y) = P(Y = y).
4. Continuous densities are denoted by f(y).
5. Cumulative distribution function: F(y) = P(Y ≤ y).

For a discrete random variable, we have a probability mass function (pmf). The pmf looks like a bunch of spikes, and probabilities are represented by the heights of the spikes. For a continuous random
variable, we have a probability density function (pdf). The pdf looks like a curve, and probabilities are represented by areas under the curve.
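The noisy-measurement averaging described above can be sketched in a few lines of Python. This is a minimal illustration, not part of the notes: the true voltage 5.0, the noise level, and the seed are arbitrary choices.

```python
import random

def empirical_average(true_value, noise_std, n, seed=0):
    """Simulate n noisy readings y_i = a + eps_i and return their mean."""
    rng = random.Random(seed)
    return sum(true_value + rng.gauss(0.0, noise_std) for _ in range(n)) / n

# The empirical average tightens around the true value a = 5.0 as n grows.
print(empirical_average(5.0, 1.0, n=10))
print(empirical_average(5.0, 1.0, n=100_000))
```

Because ỹ(n) is itself a random variable, rerunning with a different seed gives a different value; only its spread around a shrinks as n grows.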
Continuous Random Variable: a continuous random variable usually represents measurement data [time, weight, distance, etc.]; it takes on an uncountable number of values, which means you can never list all possible outcomes even if you had an infinite amount of time. X = time it takes you to drive home from class: X > 0; it might be 30.1 minutes measured to the nearest tenth, but in reality is the actual time 30.10000001... minutes? A continuous random variable has an infinite number of possible values, and the probability of any one particular value is zero.

Several Random Variables: if two or more random variables are used to define a process, then we speak of several random variables, and correspondingly a joint distribution function (JDF) can be defined. For example, if X and Y are two random variables, then the JDF is

F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y)

Suppose the JDF F_{X,Y}(x, y) is continuous everywhere; then the partial derivative

f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y) / (∂x ∂y)

is also continuous and is called the joint probability density function (JPDF). F_{X,Y}(x, y) is a monotone non-decreasing function of both x and y and is always non-negative. The total volume under the graph of the JPDF must be unity:

∫_{-∞}^{∞} ∫_{-∞}^{∞} f_{X,Y}(x, y) dx dy = 1

The distribution of a single RV X can be obtained from its JPDF with a second RV Y:

F_X(x) = ∫_{-∞}^{x} ∫_{-∞}^{∞} f_{X,Y}(ξ, y) dy dξ
Differentiating both sides with respect to x:

f_X(x) = ∫_{-∞}^{∞} f_{X,Y}(x, y) dy

Thus the PDF f_X(x) may be obtained from the JPDF f_{X,Y}(x, y) by integrating over all values of the undesired RV Y. The PDFs f_X(x) and f_Y(y) are called marginal densities. Hence the JPDF f_{X,Y}(x, y) contains all possible information about the joint RVs X and Y.

The conditional PDF of Y given that X = x is defined by

f_Y(y | x) = f_{X,Y}(x, y) / f_X(x)

provided that f_X(x) > 0, where f_X(x) is the marginal density of X. The function f_Y(y | x) may be thought of as a function of the variable y; accordingly it satisfies all the requirements of an ordinary probability density function, as shown by

f_Y(y | x) ≥ 0  and  ∫_{-∞}^{∞} f_Y(y | x) dy = 1

If the random variables X and Y are statistically independent, knowledge of the outcome of X can in no way affect the distribution of Y. As a result, the conditional density f_Y(y | x) reduces to the marginal density:

f_Y(y | x) = f_Y(y)

The JPDF of the RVs can then be expressed as the product of their respective marginal densities:

f_{X,Y}(x, y) = f_X(x) f_Y(y)

This relation holds only when the RVs X and Y are statistically independent.

Statistical Averages: statistical averages are important in the measurement of quantities that are obscured by random variations. The mean or expected value of a random variable X is commonly defined by

m_X = E[X] = ∫_{-∞}^{∞} x f_X(x) dx
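The marginal, conditional, and independence relations above can be checked on a small discrete example, where sums replace the integrals. The joint pmf below is hypothetical (two binary variables built to be independent), chosen only to make the factorization visible.

```python
from itertools import product

# Hypothetical joint pmf of two binary RVs X and Y: a discrete analogue
# of the joint pdf; here X and Y are constructed to be independent.
p_joint = {(x, y): 0.25 for x, y in product([0, 1], repeat=2)}

# Marginals: sum the joint pmf over the undesired variable
# (the discrete counterpart of integrating the JPDF over y).
pX = {x: sum(p_joint[(x, y)] for y in [0, 1]) for x in [0, 1]}
pY = {y: sum(p_joint[(x, y)] for x in [0, 1]) for y in [0, 1]}

def p_y_given_x(y, x):
    """Conditional pmf p(y | x) = p(x, y) / pX(x), defined when pX(x) > 0."""
    return p_joint[(x, y)] / pX[x]

# Independence: the joint pmf factors into the product of the marginals.
independent = all(abs(p_joint[(x, y)] - pX[x] * pY[y]) < 1e-12
                  for (x, y) in p_joint)
```

As the text requires, each conditional pmf p(y | x) is non-negative and sums to one over y, and for this independent pair it coincides with the marginal pY.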
Here E is the expectation operator, and m_X locates the center of gravity of the area under the probability density curve of the random variable X. The mean of a function g(X) of X is defined by

E[g(X)] = ∫_{-∞}^{∞} g(x) f_X(x) dx

For the special case g(X) = X^n, we obtain the nth moment of the probability distribution of the RV X; that is,

E[X^n] = ∫_{-∞}^{∞} x^n f_X(x) dx

The most important moments of X are the first two. For n = 1 the mean of the RV is obtained, and n = 2 gives the mean-square value of X:

E[X²] = ∫_{-∞}^{∞} x² f_X(x) dx

We may also consider moments of the difference between a random variable X and its mean m_X. The nth central moment is

E[(X − m_X)^n] = ∫_{-∞}^{∞} (x − m_X)^n f_X(x) dx

For n = 1 the central moment is zero, and for n = 2 the second central moment is the variance of the RV:

Var[X] = E[(X − m_X)²] = ∫_{-∞}^{∞} (x − m_X)² f_X(x) dx

The variance of the random variable X is commonly denoted σ_X². The square root of the variance is called the standard deviation of the random variable X. The variance of the random variable X is in some sense a measure of the variable's dispersion: by specifying the variance we essentially constrain the effective width of the PDF f_X(x) of the RV X about the mean m_X.

The characteristic function φ_X(v) of the probability distribution of the RV X is another important statistical average. It is defined as the expectation of exp(jvX):

φ_X(v) = E[exp(jvX)] = ∫_{-∞}^{∞} f_X(x) exp(jvx) dx

Since v and x play roles analogous to the variables 2πf and t of the Fourier transform, the following inverse relation can be deduced:

f_X(x) = (1/2π) ∫_{-∞}^{∞} φ_X(v) exp(−jvx) dv
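The moment integrals above can be approximated numerically for any pdf. As a sketch, the snippet below uses a midpoint-rule integral and the uniform density on [0, 1] (for which the exact answers are mean 1/2, mean-square 1/3, variance 1/12); it also uses the standard identity Var[X] = E[X²] − m_X², which follows from expanding the second central moment.

```python
def moment(pdf, n, a, b, steps=100_000):
    """Approximate the n-th moment  ∫ x^n f(x) dx  by the midpoint rule on [a, b]."""
    h = (b - a) / steps
    return sum((a + (i + 0.5) * h) ** n * pdf(a + (i + 0.5) * h)
               for i in range(steps)) * h

uniform_pdf = lambda x: 1.0   # f_X(x) = 1 on [0, 1], zero elsewhere

mean = moment(uniform_pdf, 1, 0.0, 1.0)         # exact value 1/2
mean_square = moment(uniform_pdf, 2, 0.0, 1.0)  # exact value 1/3
variance = mean_square - mean ** 2              # exact value 1/12
```

Swapping in any other density (normalized on its support) recovers the corresponding moments the same way.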
This relation may be used to evaluate the PDF f_X(x) of the RV X.

Random Process: A (one-dimensional) random process is a (scalar) function y(t), where t is usually time, for which the future evolution is not determined uniquely by any set of initial data, or at least by any set that is knowable to you and me. In other words, "random process" is just a fancy phrase that means "unpredictable function". Random processes y take on a continuum of values ranging over some interval, often but not always −∞ to +∞. The generalization to y's with discrete (e.g., integral) values is straightforward. Examples of random processes are: (i) the total energy E(t) in a cell of gas that is in contact with a heat bath; (ii) the temperature T(t) at the corner of Main Street and Center Street in Logan, Utah; (iii) the earth-longitude of a specific oxygen molecule in the earth's atmosphere. One can also deal with random processes that are vector or tensor functions of time.

Ensembles of random processes: Since the precise time evolution of a random process is not predictable, if one wishes to make predictions one can do so only probabilistically. The foundation for probabilistic predictions is an ensemble of random processes, i.e., a collection of a huge number of random processes each of which behaves in its own, unpredictable way. The probability density function describes the general distribution of the magnitude of the random process, but it gives no information on the time or frequency content of the process. Ensemble averaging and time averaging can be used to obtain the process properties.

Ensemble averaging: properties of the process are obtained by averaging over a collection or ensemble of sample records, using values at corresponding times.
Time averaging: properties are obtained by averaging over a single record in time.
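The two averaging methods can be illustrated numerically with the textbook process X(t) = cos(t + Θ), where the phase Θ is uniform on [0, 2π); this process is stationary and ergodic, so both averages estimate the same mean, 0. The sample sizes and seed below are arbitrary choices for the sketch.

```python
import math
import random

# X(t) = cos(t + Θ), Θ uniform on [0, 2π): a classic ergodic process.
rng = random.Random(1)
phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(5_000)]

# Ensemble average: fix a time t0 and average across many realizations.
t0 = 0.7
ensemble_mean = sum(math.cos(t0 + p) for p in phases) / len(phases)

# Time average: fix one realization (one phase) and average over time.
record = [math.cos(0.01 * k + phases[0]) for k in range(62_832)]  # ~100 periods
time_mean = sum(record) / len(record)
```

Both `ensemble_mean` and `time_mean` come out near zero; for a nonstationary or non-ergodic process the two averages would generally disagree.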
Stationary random processes: A random process is said to be stationary if its statistical characterization is independent of the observation interval over which the process was initiated; ensemble averages do not vary with time. An ensemble of random processes is stationary if and only if its probability distributions p_n depend only on time differences, not on absolute time:

p_n(y_n, t_n + τ; ...; y_2, t_2 + τ; y_1, t_1 + τ) = p_n(y_n, t_n; ...; y_2, t_2; y_1, t_1)

This property must hold for all the absolute probabilities p_n. Most stationary random processes can be treated as ergodic. A random process is ergodic if every member of the process carries with it the complete statistics of the whole process; its ensemble averages then equal the appropriate time averages. Of necessity, an ergodic process must be stationary, but not all stationary processes are ergodic.

Nonstationary random processes: Nonstationary random processes arise when one is studying a system whose evolution is influenced by some sort of clock that cares about absolute time. For example, the speeds v(t) of the oxygen molecules in downtown Logan, Utah make up an ensemble of random processes regulated in part by the rotation of the earth and the orbital motion of the earth around the sun; the influence of these clocks makes v(t) a nonstationary random process. By contrast, stationary random processes arise in the absence of any regulating clocks. An example is the speeds v(t) of oxygen molecules in a room kept at constant temperature.

Ergodic Process: A stationary process in which averages from a single record are the same as those obtained by averaging over the ensemble. If every member of the process carries with it the complete statistics of the whole process then it is called an ergodic process, and ensemble averages will equal the appropriate time averages. But not all stationary processes are ergodic.

Partial description of a random process: The mean of the process is constant.
The autocorrelation function of the process is independent of a shift of the time origin. The autocorrelation function at a lag of zero is finite. A random process that is not strictly stationary but for which these conditions hold is said to be wide-sense stationary.

Properties of Autocorrelation:
1. It is an even function of the time lag: R_X(τ) = R_X(−τ).
2. The mean-square value equals the autocorrelation function of the process at zero time lag: R_X(0) = E[X²(t)].
3. The autocorrelation function has maximum magnitude at zero time lag: |R_X(τ)| ≤ R_X(0).
4. It measures the interdependence of two random variables obtained by observing the random process at times τ apart: the more rapidly X(t) changes with time, the more rapidly the autocorrelation function decreases from its maximum.

Properties of Gaussian Process:
1. If a Gaussian process X(t) is applied to a stable linear filter, then the random process Y(t) developed at the output of the filter is also Gaussian.
2. If a Gaussian process is wide-sense stationary, then the process is also stationary in the strict sense.
3. Consider the samples X(t1), X(t2), ..., X(tn) of the process X(t) at times t1, t2, ..., tn. If X(t) is Gaussian, then this set of RVs is jointly Gaussian for any n.
4. If the set of RVs X(t1), X(t2), ..., X(tn), the samples of X(t) at times t1, t2, ..., tn, are uncorrelated, then they are statistically independent.

Central Limit Theorem: Let X1, X2, X3, ..., Xn be a set of RVs that satisfies the following requirements:
1. The Xk, with k = 1, 2, ..., n, are statistically independent.
2. The Xk all have the same pdf.
3. The mean and variance exist for each Xk.
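The even-symmetry and zero-lag properties can be verified numerically on a sample record. The sketch below uses a biased sample estimate of R_X(τ) on a white-noise-like sequence; the record length and seed are arbitrary.

```python
import random

def autocorr(x, lag):
    """Biased sample estimate of R_X(lag): (1/N) Σ x[i] x[i+lag] over valid i."""
    n = len(x)
    return sum(x[i] * x[i + lag] for i in range(n) if 0 <= i + lag < n) / n

rng = random.Random(7)
x = [rng.gauss(0.0, 1.0) for _ in range(20_000)]  # zero-mean, unit-variance record

r0 = autocorr(x, 0)    # ≈ E[X²], the mean-square value (near 1 for this record)
r5 = autocorr(x, 5)
r_minus5 = autocorr(x, -5)
# Even function: r5 equals r_minus5; max magnitude at zero lag: |r5| <= r0.
```

For this rapidly varying (uncorrelated) record the estimate drops from r0 to near zero within one lag, matching the property that faster-changing processes have faster-decaying autocorrelation.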
If the RV Y is defined by

Y = Σ_{k=1}^{n} X_k

then, according to the central limit theorem, the standardized RV

Z = (Y − E[Y]) / σ_Y

approaches a Gaussian RV with zero mean and unit variance as the number of RVs X1, X2, X3, ..., Xn increases without limit. The mean and variance of Y are related to the moments of the Xk by

E[Y] = Σ_{k=1}^{n} E[X_k]  and  Var[Y] = Σ_{k=1}^{n} Var[X_k]

The theorem gives only the limiting form of the distribution function of the standardized sum Z as n tends to infinity.
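A quick simulation makes the theorem concrete: sum independent uniform RVs on [0, 1] (for which E[X_k] = 1/2 and Var[X_k] = 1/12) and standardize. The choices n = 30 and 20 000 trials are illustrative, not prescribed by the theorem.

```python
import math
import random

def standardized_sums(n, trials, seed=0):
    """Draw `trials` samples of Z = (Y - E[Y]) / σ_Y, with Y = Σ_{k=1}^n X_k."""
    rng = random.Random(seed)
    mean_y = n * 0.5               # E[Y] = Σ E[X_k]
    sigma_y = math.sqrt(n / 12.0)  # Var[Y] = Σ Var[X_k] for independent X_k
    return [(sum(rng.random() for _ in range(n)) - mean_y) / sigma_y
            for _ in range(trials)]

z = standardized_sums(n=30, trials=20_000)
sample_mean = sum(z) / len(z)
sample_var = sum(v * v for v in z) / len(z)
within_one_sigma = sum(1 for v in z if abs(v) < 1.0) / len(z)
# Expect sample_mean ≈ 0, sample_var ≈ 1, within_one_sigma ≈ 0.68 (Gaussian-like).
```

Even at n = 30 the standardized sum is already very close to Gaussian: roughly 68% of the samples fall within one standard deviation, as the Gaussian law predicts.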