3F1 Random Processes Examples Paper (for all 6 lectures)


1. Three factories make the same electrical component. Factory A supplies half of the total number of components to the central depot, while factories B and C each supply a quarter of the total number. It is known that 5% of components from factory A are defective, while 4% and 2% are defective from factories B and C, respectively.
Define a suitable Sample Space (set of possible outcomes) for this problem and mark these outcomes as regions on a Venn Diagram. Identify the regions "Made at Factory C" and "Component is faulty" on the Venn Diagram.
A randomly selected component at the depot is found to be faulty. What is the probability that the component was made by factory C?

2. In a digital communication system, the messages are encoded into the binary symbols 0 and 1. Because of noise in the system, the incorrect symbol is sometimes received. Suppose the probability of a 0 being transmitted is 0.4 and the probability of a 1 being transmitted is 0.6. Further suppose that the probability of a transmitted 0 being received as a 1 is 0.08, and the probability of a transmitted 1 being received as a 0 is 0.05. Find:
(a) The probability that a received 0 was transmitted as a 0.
(b) The probability that a received 1 was transmitted as a 1.
(c) The probability that any symbol is received in error.

3. A random variable has a Cumulative Distribution Function (CDF) given by:
    F_X(x) = 0              for x < 0
    F_X(x) = 1 − e^(−x)     for 0 ≤ x < ∞
Find:
(a) The probability that X > 0.5
(b) The probability that X ≤ 0.25
(c) The probability that 0.3 < X ≤ 0.7
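As a quick numerical cross-check of the Bayes' rule calculation in Question 1, the depot can be simulated directly. The Python sketch below is illustrative only; it simply encodes the supply fractions and defect rates given in the question.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 1_000_000

    # Which factory made each component: A with prob 1/2, B and C with prob 1/4 each.
    factory = rng.choice(["A", "B", "C"], size=N, p=[0.5, 0.25, 0.25])
    rates = np.where(factory == "A", 0.05, np.where(factory == "B", 0.04, 0.02))
    faulty = rng.random(N) < rates

    # Relative-frequency estimate of P(made by C | faulty); Bayes' rule gives 1/8.
    print(np.mean(factory[faulty] == "C"))     # approximately 0.125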

4. Sketch the Probability Density Function (PDF) and Cumulative Distribution Function (CDF) for each of the following distributions (in parts (b) and (c) you will first need to calculate the normalizing constants c, and you may use a computer for these plots if you wish). Determine the mean µ and standard deviation σ of each, and mark µ and µ ± σ on the PDF sketches. Determine also the modes (locations of maximum probability) for the continuous pdfs (b) and (c).
(a) Discrete Random Variable with three possible values: Pr{X = −0.5} = 1/5, Pr{X = 0} = 2/5 and Pr{X = +1} = 2/5
(b) Laplace Distribution: f_X(x) = c exp(−|x|), −∞ < x < +∞
(c) Rayleigh Distribution: f_X(x) = c x exp(−x²/2), 0 ≤ x < +∞

5. A real-valued random variable U is distributed as a Gaussian with mean zero and variance σ². A new random variable X = g(U) is defined as a function of U. Determine and sketch the probability density function of X when g(·) takes the following forms:
(a) g(U) = |U|
(b) g(U) = U²
(c) g(U) = U + a for U < −a;  g(U) = 0 for −a ≤ U ≤ a;  g(U) = U − a for U > a  (a is a positive constant)

6. A darts player, aiming at the bullseye, measures the accuracy of shots in terms of x and y coordinates relative to the centre of the dartboard. It is found from many measurements that the x- and y-values of the shots are independent and normally distributed with mean zero and standard deviation σ.
Write down the joint probability density function for the x and y measurements, and write down an integral expression which gives the probability that x lies between a and b and y lies between c and d.
Show that the Cumulative Distribution Function (CDF) for R, the radial distance of shots from the centre of the dartboard, is given by:
    F_R(r) = 1 − exp(−r²/(2σ²)),   0 ≤ r < ∞
Determine the Probability Density Function for R. What type of distribution is this?
If σ = 3 cm, what is the probability that the player's next shot hits the bullseye, which has a diameter of 2 cm?
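For the computer-generated sketches suggested in Question 4(b) and (c), the normalising constants, means, standard deviations and modes can be obtained numerically on a grid. The Python sketch below is one way to do this; the truncation limits of the grids are arbitrary but wide enough for these densities.

    import numpy as np
    import matplotlib.pyplot as plt

    def summarise(x, unnormalised):
        dx = x[1] - x[0]
        c = 1.0 / (unnormalised.sum() * dx)          # normalising constant
        pdf = c * unnormalised
        cdf = np.cumsum(pdf) * dx
        mu = (x * pdf).sum() * dx
        sigma = np.sqrt(((x - mu) ** 2 * pdf).sum() * dx)
        return c, pdf, cdf, mu, sigma

    x1 = np.linspace(-10, 10, 4001)                  # Laplace support (truncated)
    x2 = np.linspace(0, 6, 2001)                     # Rayleigh support (truncated)
    cases = [("Laplace", x1, np.exp(-np.abs(x1))),
             ("Rayleigh", x2, x2 * np.exp(-x2 ** 2 / 2))]

    for name, x, f in cases:
        c, pdf, cdf, mu, sigma = summarise(x, f)
        print(name, "c =", round(c, 3), "mean =", round(mu, 3),
              "std =", round(sigma, 3), "mode =", round(x[np.argmax(pdf)], 2))
        plt.plot(x, pdf, label=name + " pdf")
        plt.plot(x, cdf, "--", label=name + " cdf")
    plt.legend(); plt.show()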

7. Two random variables X and Y have a joint Probability Density Function given by:
    f_XY(x, y) = kxy   for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
    f_XY(x, y) = 0     otherwise
Determine:
(a) The value of k which makes this a valid Probability Density Function.
(b) The probability of the event X ≤ 1/2 AND Y > 1/2.
(c) The marginal densities f_X(x) and f_Y(y).
(d) The conditional density f_Y|X(y|x).
(e) Whether X and Y are independent.

8. A particular Random Process {X(t)} has an ensemble which consists of four sample functions:
    X(t, α₁) = 1,  X(t, α₂) = −1,  X(t, α₃) = sin(πt),  X(t, α₄) = cos(πt)
where {α₁, α₂, α₃, α₄} is the set of possible outcomes in some underlying Random Experiment, each of which occurs with equal probability. Determine the first order Probability Density Function (PDF) f_X(t)(x) and the expected value of the process E{X(t)}. Is the process Strict Sense Stationary?

9. For each of the following random processes, sketch a few members of the ensemble, determine the expected values and autocorrelation functions of the processes, and state which of them are wide sense stationary (WSS):
(a) X(t) = A, where A is uniformly distributed between 0 and 1.
(b) X(t) = cos(2πf₀t + Φ), where f₀ is fixed and Φ is uniformly distributed between 0 and ϕmax. This process is in general non-stationary. Are there any values of ϕmax for which the process is WSS?
(c) X(t) = A cos(2πf₀t + Φ), where f₀ is fixed, Φ is uniformly distributed between 0 and 2π, A is uniformly distributed between 0 and 1, and Φ and A are statistically independent.
Is the process in part (a) Mean Ergodic?
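To sketch a few ensemble members for Question 9, one can simply draw A and Φ at random and plot the resulting sample functions. In the Python sketch below the frequency f₀ = 2 and the choice ϕmax = 2π are arbitrary values used only for illustration.

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(1)
    t = np.linspace(0, 2, 500)
    f0, phi_max = 2.0, 2 * np.pi          # arbitrary choices for the sketch
    fig, axes = plt.subplots(3, 1, sharex=True)

    for _ in range(4):                    # four ensemble members of each process
        A = rng.uniform(0, 1)
        phi = rng.uniform(0, phi_max)
        axes[0].plot(t, np.full_like(t, A))                      # (a) X(t) = A
        axes[1].plot(t, np.cos(2 * np.pi * f0 * t + phi))        # (b) X(t) = cos(w0 t + phi)
        axes[2].plot(t, A * np.cos(2 * np.pi * f0 * t + phi))    # (c) X(t) = A cos(w0 t + phi)

    for ax, label in zip(axes, ["(a)", "(b)", "(c)"]):
        ax.set_ylabel(label)
    plt.show()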

10. Show the following results for a wide-sense stationary (WSS), real-valued random process {X(t)} with autocorrelation function r_XX(τ) and power spectrum S_X(ω):
(a) r_XX(−τ) = r_XX(+τ);
(b) If {X(t)} represents a random voltage across a 1 Ω resistance, the average power dissipated is P_av = r_XX(0);
(c) S_X(ω) = S_X(−ω);
(d) S_X(ω) is real-valued;
(e) P_av = (1/(2π)) ∫ S_X(ω) dω.

11. A white noise process {X(t)} is a Wide Sense Stationary, zero mean process with autocorrelation function:
    r_XX(τ) = σ² δ(τ)
where δ(τ) is the delta-function centred on τ = 0, whose area is unity and width is zero. Sketch the Power Spectrum for this process.
A sample function X(t) from such a white noise process is applied as the input to a linear system whose impulse response is h(t). The output is Y(t). Derive expressions for the output autocorrelation function r_YY(τ) = E[Y(t) Y(t+τ)] and the cross-correlation function between the input and output, r_XY(τ) = E[X(t) Y(t+τ)]. Hence obtain an expression for the output power spectrum S_Y(ω) in terms of σ² and the frequency response H(ω) of the linear system.
If the process is correlation ergodic, suggest in block diagram form a scheme for measurement of r_XY(τ). What is a possible application for such a scheme?

12. It is desired to predict a future value X(t + T) of a WSS random process {X(t)} from the current value X(t) using the formula:
    X̂(t + T) = c X(t)
where c is a constant to be determined.
If the process has autocorrelation function r_XX(τ), show that the value of c which leads to minimum mean squared error between X(t + T) and X̂(t + T) is
    c = r_XX(T) / r_XX(0)
Hence obtain an expression for the expected mean squared error in this case.
If {X(t)} has non-zero mean, suggest an improved formula for X̂(t + T), giving reasons.

Nick Kingsbury; October 9,

Tripos questions: This 3F1 course has been given since 2003 and is similar in content to the Random Signals course from the old third-year paper E4, so questions on random processes and probability from E4 Tripos papers up to 2002 will also be relevant.

Answers:
1.  1/8
2.  (a) 0.925   (b) 0.947   (c) 0.062
3.  (a) 0.607   (b) 0.221   (c) 0.244
4.  (a) µ = 0.3; σ = 0.6.
    (b) c = 0.5; µ = 0; σ = √2 ≈ 1.414; mode at x = 0.
    (c) c = 1; µ = √(π/2) ≈ 1.2533; σ = √(2 − π/2) ≈ 0.655; mode at x = 1.
5.  (a) f_X(x) = (2/√(2πσ²)) exp(−x²/(2σ²)) for 0 ≤ x < +∞; 0 elsewhere.
    (b) f_X(x) = (1/(√(2πx) σ)) exp(−x/(2σ²)) for 0 ≤ x < +∞; 0 elsewhere.
    (c) f_X(x) = (1/√(2πσ²)) exp(−(x−a)²/(2σ²)) for −∞ < x < 0;
        f_X(x) = (2Φ(a/σ) − 1) δ(x) for x = 0, where Φ(x) = ∫ from −∞ to x of (1/√(2π)) exp(−u²/2) du;
        f_X(x) = (1/√(2πσ²)) exp(−(x+a)²/(2σ²)) for 0 < x < ∞.
6.  f_R(r) = (r/σ²) exp(−r²/(2σ²)) for 0 ≤ r < ∞; 0 elsewhere.  Pr(bullseye) ≈ 0.054.
7.  (a) 4   (b) 3/16
    (c) f_X(x) = 2x for 0 ≤ x ≤ 1, 0 elsewhere;  f_Y(y) = 2y for 0 ≤ y ≤ 1, 0 elsewhere.
    (d) f_Y|X(y|x) = 2y for 0 ≤ y ≤ 1, 0 elsewhere.
8.  f_X(t)(x) = (1/4)[δ(x − 1) + δ(x + 1) + δ(x − sin(πt)) + δ(x − cos(πt))]
    E[X(t)] = (1/4)[sin(πt) + cos(πt)]

9.  (a) E[X(t)] = 1/2;  E[X(t₁)X(t₂)] = 1/3.  WSS.
    (b) E[X(t)] = [sin(ω₀t + ϕmax) − sin(ω₀t)] / ϕmax;
        E[X(t₁)X(t₂)] = [sin(ω₀(t₁+t₂) + 2ϕmax) − sin(ω₀(t₁+t₂)) + 2ϕmax cos(ω₀(t₁−t₂))] / (4ϕmax)
        Not WSS unless ϕmax = 2nπ (integer n).
    (c) E[X(t)] = 0;  E[X(t₁)X(t₂)] = (1/6) cos(ω₀(t₁−t₂)).  WSS.
11. S_X(ω) = σ²;  r_YY(τ) = σ² h(τ) ∗ h(−τ);  r_XY(τ) = σ² h(τ);  S_Y(ω) = σ² |H(ω)|²   (∗ is convolution).
12. Minimum MSE = r_XX(0) − r_XX(T)² / r_XX(0).

3F1 Random Processes Examples Paper Solutions

1. Sample space: {AF, AF̄, BF, BF̄, CF, CF̄}, where {A, B, C} are the factories and {F, F̄} are the faulty / not-faulty states.
Venn diagram: the six outcomes form a 3 x 2 grid of disjoint regions (which may be drawn with areas proportional to their probabilities). The regions {CF, CF̄} together make up "Made at Factory C", and {AF, BF, CF} together make up "Component is faulty".
Probability that a faulty component was made at C:
    P(C | F) = P(F | C) P(C) / P(F)
             = P(F | C) P(C) / [P(F | A) P(A) + P(F | B) P(B) + P(F | C) P(C)]
             = (0.02/4) / (0.05/2 + 0.04/4 + 0.02/4)
             = 0.005 / 0.04 = 1/8

2. Let {t0, t1} and {r0, r1} be the pairs of transmit and receive states. Then
    P(t0) = 0.4,  P(t1) = 0.6  and  P(r1 | t0) = 0.08,  P(r0 | t1) = 0.05
(a) P(t0 | r0) = P(r0 | t0) P(t0) / P(r0)
              = (1 − P(r1 | t0)) P(t0) / [(1 − P(r1 | t0)) P(t0) + P(r0 | t1) P(t1)]
              = (1 − 0.08)(0.4) / [(1 − 0.08)(0.4) + (0.05)(0.6)] = 0.368 / 0.398 = 0.925
(b) P(t1 | r1) = P(r1 | t1) P(t1) / P(r1)
              = (1 − P(r0 | t1)) P(t1) / [(1 − P(r0 | t1)) P(t1) + P(r1 | t0) P(t0)]
              = (1 − 0.05)(0.6) / [(1 − 0.05)(0.6) + (0.08)(0.4)] = 0.570 / 0.602 = 0.947

(c) P(error) = P(r1 | t0) P(t0) + P(r0 | t1) P(t1) = (0.08)(0.4) + (0.05)(0.6) = 0.062

3. (a) Pr{X > 0.5} = 1 − F_X(0.5) = e^(−0.5) = 0.607
   (b) Pr{X ≤ 0.25} = F_X(0.25) = 1 − e^(−0.25) = 0.221
   (c) Pr{0.3 < X ≤ 0.7} = F_X(0.7) − F_X(0.3) = e^(−0.3) − e^(−0.7) = 0.244

4. (a) E[X] = Σ xᵢ pᵢ = (−0.5)(1/5) + (0)(2/5) + (1)(2/5) = 0.3
       E[X²] = Σ xᵢ² pᵢ = (0.25)(1/5) + (0)(2/5) + (1)(2/5) = 0.45
       σ² = E[X²] − (E[X])² = 0.45 − 0.09 = 0.36,  so σ = 0.6
   [Plots: pdf and cdf of the discrete RV, with µ − σ, µ and µ + σ marked on the pdf.]
   (b) The normalising constant c must make the area under the pdf unity:
       ∫ f_X(x) dx = c ∫ e^(−|x|) dx (over −∞ < x < ∞) = 2c ∫_0^∞ e^(−x) dx = 2c [−e^(−x)]_0^∞ = 2c = 1,  ∴ c = 0.5

   [Plots: pdf and cdf of the Laplace distribution, with µ − σ, µ and µ + σ marked on the pdf.]
       E[X] = 0, from the symmetry of f_X.
       σ² = E[X²] = ∫ x² f_X(x) dx = 2 ∫_0^∞ x² c e^(−x) dx = 2c ∫_0^∞ x² e^(−x) dx
       Use integration by parts:
       ∫_0^∞ x² e^(−x) dx = [−x² e^(−x)]_0^∞ + 2 ∫_0^∞ x e^(−x) dx = 0 + 2([−x e^(−x)]_0^∞ + ∫_0^∞ e^(−x) dx) = 2
       ∴ σ² = 4c = 2, so σ = √2 ≈ 1.414.  The mode is where f_X is maximum, i.e. at x = 0 (by inspection).
   (c) The normalising constant c must make the area under the pdf unity:
       ∫ f_X(x) dx = c ∫_0^∞ x e^(−x²/2) dx = c ∫_0^∞ e^(−u) du   (with u = x²/2)  = c [−e^(−u)]_0^∞ = c = 1,  ∴ c = 1
   [Plots: pdf and cdf of the Rayleigh distribution, with µ − σ, µ and µ + σ marked on the pdf.]

       E[X] = ∫_0^∞ x (x e^(−x²/2)) dx = ∫_0^∞ √(2u) e^(−u) du   (with u = x²/2)  = √2 Γ(3/2) = √(π/2) ≈ 1.2533
       E[X²] = ∫_0^∞ x² (x e^(−x²/2)) dx = 2 ∫_0^∞ u e^(−u) du = 2 Γ(2) = 2
       σ² = E[X²] − (E[X])² = 2 − π/2,  so σ ≈ 0.655
       The mode is where the gradient of the pdf is zero:
       d f_X/dx = e^(−x²/2) − x (x e^(−x²/2)) = (1 − x²) e^(−x²/2) = 0  when x = 1.  Hence the mode is at x = 1.

5. pdf of U:  f_U(u) = (1/√(2πσ²)) e^(−u²/(2σ²))
   cdf of U:  F_U(u) = ∫ f_U(α) dα (from −∞ to u) = 1 − Q(u/σ),  where Q(x) = ∫_x^∞ (1/√(2π)) e^(−α²/2) dα is the Gaussian Error Integral function.
   (a) X = g(U) = |U|:
       F_X(x) = P({u : |u| ≤ x}) = P({u : −x ≤ u ≤ x}) = F_U(x) − F_U(−x)   for 0 ≤ x < ∞
              = F_U(x) − [1 − F_U(x)] = 2 F_U(x) − 1
       ∴ f_X(x) = d F_X(x)/dx = 2 d F_U(x)/dx = 2 f_U(x) = (2/√(2πσ²)) e^(−x²/(2σ²))   for 0 ≤ x < ∞
   [Plots: pdfs of X = g(U) = |U| and X = g(U) = U², as functions of x/σ.]

   (b) X = g(U) = U²:
       F_X(x) = P({u : u² ≤ x}) = P({u : −√x ≤ u ≤ √x}) = F_U(√x) − F_U(−√x) = 2 F_U(√x) − 1   for 0 ≤ x < ∞
       ∴ f_X(x) = d F_X(x)/dx = 2 d F_U(√x)/dx = (1/√x) f_U(√x) = (1/(√(2πx) σ)) e^(−x/(2σ²))   for 0 ≤ x < ∞
   (c) X = g(U) = U + a for U < −a;  0 for −a ≤ U ≤ a;  U − a for U > a.
       Consider the 3 regions separately:
       i.  U < −a, X < 0:
           F_X(x) = P({u : u + a ≤ x}) = P({u : u ≤ x − a}) = F_U(x − a)
           ∴ f_X(x) = f_U(x − a) = (1/√(2πσ²)) e^(−(x−a)²/(2σ²))   for x < 0
       ii. −a ≤ U ≤ a, X = 0:
           Pr{X = 0} = P({u : −a ≤ u ≤ a}) = F_U(a) − F_U(−a) = 2 F_U(a) − 1
           This is a probability mass at x = 0, so
           f_X(x) = [2 F_U(a) − 1] δ(x) = [1 − 2 Q(a/σ)] δ(x)   for x = 0
       iii. U > a, X > 0:
           F_X(x) = P({u : u − a ≤ x}) = P({u : u ≤ x + a}) = F_U(x + a)
           ∴ f_X(x) = f_U(x + a) = (1/√(2πσ²)) e^(−(x+a)²/(2σ²))   for x > 0
   [Plot: pdf of X = g(U) for case (c), shown against x/σ for a = σ.]

6. X and Y each have a zero-mean Gaussian pdf given by
       f_X(x) = (1/√(2πσ²)) e^(−x²/(2σ²))   and   f_Y(y) = (1/√(2πσ²)) e^(−y²/(2σ²))
   Since X and Y are independent, the joint pdf is given by
       f_X,Y(x, y) = f_X(x) f_Y(y) = (1/(2πσ²)) e^(−(x²+y²)/(2σ²))
   Hence
       Pr{a < X ≤ b, c < Y ≤ d} = ∫ (x from a to b) ∫ (y from c to d) f_X,Y(x, y) dy dx
                                = (1/(2πσ²)) ∫ (x from a to b) ∫ (y from c to d) e^(−(x²+y²)/(2σ²)) dy dx
   In polar coordinates x = r cos θ and y = r sin θ, so the above expression converts to
       Pr{r₁ < R ≤ r₂, θ₁ < θ ≤ θ₂} = (1/(2πσ²)) ∫ (θ from θ₁ to θ₂) ∫ (r from r₁ to r₂) e^(−r²/(2σ²)) r dr dθ
   To get the cdf for R, we integrate over all angles so that θ₂ − θ₁ = 2π, and we integrate from r₁ = 0 to r₂ = r, to get
       F_R(r) = Pr{0 < R ≤ r, −π < θ ≤ π} = (2π/(2πσ²)) ∫_0^r r′ e^(−r′²/(2σ²)) dr′ = [−e^(−r′²/(2σ²))]_0^r = 1 − e^(−r²/(2σ²))   for 0 ≤ r < ∞
   This is a Rayleigh distribution.
   For a 2 cm diameter bullseye, the maximum radius is 1 cm, and if σ = 3 cm:
       Pr{bullseye} = Pr{R ≤ 1 cm} = F_R(1) = 1 − e^(−1/(2 × 9)) ≈ 0.054
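This bullseye probability is easy to check by simulation. The Python sketch below assumes the same σ = 3 cm and 1 cm bullseye radius as above and is for illustration only.

    import numpy as np

    rng = np.random.default_rng(0)
    sigma, radius, N = 3.0, 1.0, 1_000_000
    x = rng.normal(0, sigma, N)          # independent N(0, sigma^2) x- and y-errors
    y = rng.normal(0, sigma, N)
    R = np.hypot(x, y)                   # radial distance of each simulated shot

    print("simulated hit rate:", np.mean(R <= radius))
    print("F_R(1)            :", 1 - np.exp(-radius**2 / (2 * sigma**2)))   # about 0.054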

7. (a) For a valid pdf:
       ∫_0^1 ∫_0^1 kxy dx dy = k [x²/2]_0^1 [y²/2]_0^1 = k/4 = 1,  ∴ k = 4
   (b) Pr{X ≤ 0.5, Y > 0.5} = ∫ (x from 0 to 0.5) ∫ (y from 0.5 to 1) 4xy dy dx = ∫ (x from 0 to 0.5) [2xy²] (y from 0.5 to 1) dx
                            = ∫_0^0.5 (3/2) x dx = (3/2)[x²/2]_0^0.5 = 3/16
   (c) f_X(x) = ∫ f_XY(x, y) dy = ∫_0^1 4xy dy = [2xy²]_0^1 = 2x   for 0 ≤ x ≤ 1  (and 0 for x < 0 or x > 1)
       f_Y(y) = 2y for 0 ≤ y ≤ 1 (by symmetry), and 0 elsewhere.
   (d) f_Y|X(y|x) = f_XY(x, y) / f_X(x) = 4xy / 2x = 2y   for 0 ≤ y ≤ 1, and 0 elsewhere.
   (e) f_X(x) f_Y(y) = (2x)(2y) = 4xy = f_XY(x, y). Therefore X and Y are independent, by definition.

8. For any particular t, X(t) can take one of the 4 values X(t, αᵢ), each with equal probability. Hence:
       f_X(t)(x) = (1/4)[δ(x − 1) + δ(x + 1) + δ(x − sin(πt)) + δ(x − cos(πt))]
       ∴ E[X(t)] = ∫ x f_X(t)(x) dx = (1/4)[1 − 1 + sin(πt) + cos(πt)] = (1/4)[sin(πt) + cos(πt)]
   Since f_X(t)(x) and E[X(t)] depend on t, the process is not SSS.

9. (a) X(t) = A:
       E[X(t)] = ∫ X(t, a) f_A(a) da = ∫_0^1 a da = [a²/2]_0^1 = 1/2
       r_XX(t₁, t₂) = E[X(t₁) X(t₂)] = ∫ X(t₁, a) X(t₂, a) f_A(a) da = ∫_0^1 a² da = 1/3
       These are independent of t, hence the process is WSS.
   (b) X(t) = cos(ω₀t + Φ), where ω₀ = 2πf₀:
       E[X(t)] = ∫ X(t, ϕ) f_Φ(ϕ) dϕ = (1/ϕmax) ∫_0^ϕmax cos(ω₀t + ϕ) dϕ = [sin(ω₀t + ϕ)/ϕmax]_0^ϕmax
               = [sin(ω₀t + ϕmax) − sin(ω₀t)] / ϕmax
       r_XX(t₁, t₂) = E[X(t₁) X(t₂)] = ∫ X(t₁, ϕ) X(t₂, ϕ) f_Φ(ϕ) dϕ = (1/ϕmax) ∫_0^ϕmax cos(ω₀t₁ + ϕ) cos(ω₀t₂ + ϕ) dϕ
                    = (1/(2ϕmax)) ∫_0^ϕmax [cos(ω₀(t₁+t₂) + 2ϕ) + cos(ω₀(t₁−t₂))] dϕ
                    = (1/(2ϕmax)) [sin(ω₀(t₁+t₂) + 2ϕ)/2]_0^ϕmax + (1/2) cos(ω₀(t₁−t₂))
                    = (1/(4ϕmax)) [sin(ω₀(t₁+t₂) + 2ϕmax) − sin(ω₀(t₁+t₂)) + 2ϕmax cos(ω₀(t₁−t₂))]
       The mean and r_XX depend on t, so the process is not in general WSS. However, if ϕmax = 2nπ (integer n), the mean is zero and hence independent of t. If ϕmax = nπ, r_XX = (1/2) cos(ω₀(t₁−t₂)), which depends only on τ = t₁ − t₂. If both of the above are satisfied then the process is WSS. Hence the process is not WSS unless ϕmax = 2nπ (integer n).

   (c) X(t) = A cos(ω₀t + Φ): the ensemble is now any sample function from (a) times any sample function from (b). Think now of a multivariate sample space which represents pairs {A, Φ}. Since A and Φ are independent:
           f_A,Φ(a, ϕ) = f_A(a) f_Φ(ϕ)
       Hence
           E[X(t)] = ∫∫ X(t, a, ϕ) f_A,Φ(a, ϕ) da dϕ = ∫∫ a cos(ω₀t + ϕ) f_A(a) f_Φ(ϕ) da dϕ
                   = ∫_0^1 a da · (1/(2π)) ∫_0^2π cos(ω₀t + ϕ) dϕ
                   = (1/(4π)) [sin(ω₀t + 2π) − sin(ω₀t)] = 0
       Similarly
           E[X(t₁) X(t₂)] = ∫∫ a² cos(ω₀t₁ + ϕ) cos(ω₀t₂ + ϕ) f_A(a) f_Φ(ϕ) da dϕ
                          = ∫_0^1 a² da · (1/(2π)) ∫_0^2π cos(ω₀t₁ + ϕ) cos(ω₀t₂ + ϕ) dϕ
                          = from (a) and (b): (1/3)(1/2) cos(ω₀(t₁−t₂)) = (1/6) cos(ω₀τ),   τ = t₁ − t₂
       Hence this process is WSS.
   Is the process in (a) Mean Ergodic? The time average is the mean over t for a single realisation a:
       E_t[X(t, a)] = lim (T → ∞) (1/(2T)) ∫ (−T to T) X(t, a) dt = lim (T → ∞) (1/(2T)) ∫ (−T to T) a dt = lim (T → ∞) (2Ta)/(2T) = a
   The ensemble average is the mean over A for a single time instant t:
       E_A[X(t, a)] = ∫ X(t, a) f_A(a) da = 1/2   from (a)
   These two averages are not the same, so the process in (a) is not ergodic (even though it is stationary).
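The same ergodicity argument can be seen numerically. The Python sketch below (illustrative only) compares the time average of one realisation of X(t) = A with the ensemble average at a fixed time instant.

    import numpy as np

    rng = np.random.default_rng(2)
    t = np.linspace(0, 100, 10_001)

    A = rng.uniform(0, 1)                     # one realisation of the process
    x_single = np.full_like(t, A)
    print("time average of one realisation:", x_single.mean(), "(equals its own A =", A, ")")

    A_ensemble = rng.uniform(0, 1, 100_000)   # many realisations, one time instant
    print("ensemble average at fixed t    :", A_ensemble.mean(), "(close to 1/2)")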

10. (a) r_XX(τ) = E[X(t) X(t+τ)] = E[X(t+τ) X(t)].  Let t′ = t + τ:
        = E[X(t′) X(t′ − τ)] = r_XX(−τ)
    (b) Instantaneous power = V²/R = X²(t)/1.  Average power: P_av = E[X²(t)] = r_XX(0)
    (c) S_X(ω) = ∫ r_XX(τ) e^(−jωτ) dτ.  Let τ′ = −τ:
        S_X(−ω) = ∫ r_XX(τ) e^(jωτ) dτ = ∫ r_XX(−τ′) e^(−jωτ′) dτ′ = from (a): ∫ r_XX(τ′) e^(−jωτ′) dτ′ = S_X(ω)
    (d) S_X(ω) = ∫ r_XX(τ) e^(−jωτ) dτ = ∫ (−∞ to 0) r_XX(τ) e^(−jωτ) dτ + ∫ (0 to ∞) r_XX(τ) e^(−jωτ) dτ
        Substituting τ → −τ in the 1st integral and using (a):
        = ∫_0^∞ r_XX(τ) [e^(jωτ) + e^(−jωτ)] dτ = ∫_0^∞ 2 r_XX(τ) cos(ωτ) dτ
        All terms in the integral are real, so the result is real.
    (e) From (b): P_av = r_XX(0) = (1/(2π)) ∫ S_X(ω) e^(jω·0) dω = (1/(2π)) ∫ S_X(ω) dω

11. Power Spectrum:
        S_X(ω) = ∫ r_XX(τ) e^(−jωτ) dτ = ∫ σ² δ(τ) e^(−jωτ) dτ = σ² e^(−jω·0) = σ²
    Hence the power spectrum is a constant, σ², over all frequencies ω.
    For the linear system:
        Y(t) = h(t) ∗ X(t) = ∫ h(β) X(t − β) dβ   (convolution)
    Autocorrelation function:
        r_YY(τ) = E[Y(t) Y(t+τ)]
                = E[(∫ h(β₁) X(t − β₁) dβ₁)(∫ h(β₂) X(t + τ − β₂) dβ₂)]
                = E[∫∫ h(β₁) h(β₂) X(t − β₁) X(t + τ − β₂) dβ₁ dβ₂]
                = ∫∫ h(β₁) h(β₂) E[X(t − β₁) X(t + τ − β₂)] dβ₁ dβ₂
                = ∫∫ h(β₁) h(β₂) r_XX(τ + β₁ − β₂) dβ₁ dβ₂
                = ∫∫ h(β₁) h(β₂) σ² δ(τ + β₁ − β₂) dβ₁ dβ₂
                = σ² ∫ h(β₂ − τ) h(β₂) dβ₂ = σ² h(−τ) ∗ h(τ)
    Cross-correlation function:
        r_XY(τ) = E[X(t) Y(t+τ)] = E[X(t) ∫ h(β) X(t + τ − β) dβ]
                = ∫ h(β) E[X(t) X(t + τ − β)] dβ = ∫ h(β) r_XX(τ − β) dβ = ∫ h(β) σ² δ(τ − β) dβ = σ² h(τ)
    Taking Fourier Transforms of the above expression for r_YY(τ):
        S_Y(ω) = σ² H*(ω) H(ω) = σ² |H(ω)|²
    since H*(ω) is the transform of h(−τ).
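The result r_XY(τ) = σ² h(τ) is the basis of the measurement scheme described next, and is easy to verify in discrete time. The Python sketch below is illustrative only; the impulse response h is an arbitrary made-up example, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(3)
    sigma = 2.0
    x = rng.normal(0, sigma, 500_000)                 # white-noise input, r_XX[k] = sigma^2 delta[k]
    h = np.array([0.5, 1.0, 0.7, 0.2, -0.1])          # "unknown" impulse response
    y = np.convolve(x, h)[:len(x)]                    # system output

    # Time-average estimate of r_XY[k] = E[ X[n] Y[n+k] ] for k = 0..7
    r_xy = np.array([np.mean(x[:len(x) - k] * y[k:]) for k in range(8)])
    print(np.round(r_xy / sigma**2, 2))               # close to h, followed by zeros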

    If the process is Correlation Ergodic, the correlations may be estimated from expectations over time:
        r_XY(τ) = lim (T → ∞) (1/(2T)) ∫ (−T to T) X(t) Y(t + τ) dt ≈ (1/(2T)) ∫ (−T to T) X(t − τ) Y(t) dt
    for some suitably large choice of T.
    The integration over 2T can be approximated by a lowpass filter with an impulse response whose main lobe is of duration roughly 2T between its half-amplitude points. The block diagram should therefore show X(t) being passed through a delay of τ, then multiplied by Y(t), and the result being passed through the lowpass filter to produce r̂_XY(τ), which is the estimate of r_XY(τ). This is known as a Correlometer. If the delay τ is varied slowly compared with the 2T response time of the filter, then the full function r̂_XY(τ) can be obtained.
    The main application of this scheme is System Identification, in which the impulse response of the system is estimated from ĥ(τ) = r̂_XY(τ)/σ².

12. Instantaneous error:
        ε(t) = X(t + T) − X̂(t + T) = X(t + T) − c X(t)
    Mean square error (MSE):
        P_ε = E[ε²(t)] = E[X²(t + T) − 2c X(t + T) X(t) + c² X²(t)]
            = (1 + c²) E[X²(t)] − 2c E[X(t + T) X(t)]   (since X is WSS)
            = (1 + c²) r_XX(0) − 2c r_XX(T)
    Differentiating to find the minimum:
        dP_ε/dc = 2c r_XX(0) − 2 r_XX(T) = 0 at the minimum,  ∴ c_min = r_XX(T) / r_XX(0)
    So the minimum MSE is:
        P_ε,min = (1 + c_min²) r_XX(0) − 2 c_min r_XX(T)
                = (1 + r_XX(T)²/r_XX(0)²) r_XX(0) − 2 r_XX(T)²/r_XX(0)
                = r_XX(0) + r_XX(T)²/r_XX(0) − 2 r_XX(T)²/r_XX(0)
                = r_XX(0) − r_XX(T)²/r_XX(0)
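As a quick numerical check of c_min = r_XX(T)/r_XX(0), the Python sketch below uses a discrete-time AR(1) process, for which r_XX[k] is proportional to a^|k|, so the predicted optimum is c = a^T. The parameter values are arbitrary and for illustration only.

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(4)
    a, N, T = 0.9, 200_000, 3
    # Zero-mean AR(1) process: x[n] = a x[n-1] + w[n], so r_XX[k] is proportional to a^|k|.
    x = lfilter([1.0], [1.0, -a], rng.normal(size=N))

    c_grid = np.linspace(0, 1, 101)
    mse = [np.mean((x[T:] - c * x[:-T]) ** 2) for c in c_grid]
    print("best c on the grid:", c_grid[int(np.argmin(mse))])   # should lie close to the value below
    print("r_XX(T)/r_XX(0)   :", a ** T)                        # = 0.729 for a = 0.9, T = 3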

    To see the effect of a non-zero mean, assume that X(t) is zero-mean, and that the non-zero-mean process is Y(t) = X(t) + µ. Hence
        r_YY(τ) = E[Y(t + τ) Y(t)] = E[(X(t + τ) + µ)(X(t) + µ)] = r_XX(τ) + µ²
    Using the given formula to predict Y(t + T) from Y(t) results in an MSE of
        P_ε,min = r_YY(0) − r_YY(T)²/r_YY(0)
                = (r_YY(0) + r_YY(T))(r_YY(0) − r_YY(T)) / r_YY(0)
                = (r_XX(0) + µ² + r_XX(T) + µ²)(r_XX(0) − r_XX(T)) / (r_XX(0) + µ²)
                = (1 + (r_XX(T) + µ²)/(r_XX(0) + µ²)) (r_XX(0) − r_XX(T))
    Because r_XX(T) < r_XX(0), we see that the addition of µ² to r_XX will always increase P_ε,min if the original predictor is used. Hence we get the minimum MSE if our predictor is applied to a zero-mean process. To optimally predict the non-zero-mean process Y(t), we should therefore subtract the mean µ and then apply the predictor to the zero-mean remainder X(t), to give:
        Ŷ(t + T) = µ + c_min (Y(t) − µ),   or   Ŷ(t + T) = c_min Y(t) + (1 − c_min) µ
    where
        c_min = r_XX(T)/r_XX(0) = (r_YY(T) − µ²)/(r_YY(0) − µ²)
    This will then leave P_ε,min unaffected by changes in µ, the mean component of Y.
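A small numerical illustration of this point is given in the Python sketch below, using the same kind of discrete-time AR(1) process as in the previous sketch with an added mean; the parameter values are arbitrary.

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(5)
    a, mu, N, T = 0.9, 3.0, 200_000, 3
    x = lfilter([1.0], [1.0, -a], rng.normal(size=N))   # zero-mean AR(1), r_XX[k] proportional to a^|k|
    y = x + mu                                          # the observed non-zero-mean process

    c_min = a ** T                                      # = r_XX(T)/r_XX(0) for this process
    err_direct   = y[T:] - c_min * y[:-T]                       # predictor applied to Y directly
    err_demeaned = (y[T:] - mu) - c_min * (y[:-T] - mu)         # mean subtracted first
    print("MSE, predictor applied directly:", np.mean(err_direct ** 2))
    print("MSE, mean removed first        :", np.mean(err_demeaned ** 2))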
