TMA 4265 Stochastic Processes


Norges teknisk-naturvitenskapelige universitet
Institutt for matematiske fag

Solution - Exercise 8 (2 October 2008)

Exercises from the text book

5.2 The expected service time for one customer is 1/µ, due to the exponential distribution. As the exponential variables are assumed not to have any memory, the expected remaining service time for the customer being served as you enter the bank is also 1/µ. The expected amount of time you wait until it is your turn is therefore 5/µ. You are also expected to spend 1/µ being served yourself, hence the expected time you will spend in the bank is 6/µ.

5.4 Let T_i be the service time for customer i. We want to find

    p = Pr(A is the last person leaving) = Pr(T_B + T_C < T_A).

a) Service time T_i = 1 min: then T_B + T_C = 2 > 1 = T_A with certainty, so p = 0.

b) The distribution of T_i is

    Pr(T_i = t) = 1/3 for t = 1, 2, 3, and 0 otherwise.
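The 6/µ result in 5.2 is easy to sanity-check by simulation. The sketch below simply sums six independent Exp(µ) times: five customers ahead of you (memorylessness makes the remaining service of the customer currently being served a fresh Exp(µ)) plus your own service. The rate µ = 2.0, the seed, and the trial count are illustrative choices, not from the text.

```python
import random

random.seed(42)
mu = 2.0           # illustrative service rate
n_trials = 200_000

total = 0.0
for _ in range(n_trials):
    # Five Exp(mu) waits ahead of you (the customer in service counts as a
    # fresh Exp(mu) by memorylessness) plus your own Exp(mu) service time.
    total += sum(random.expovariate(mu) for _ in range(6))

estimate = total / n_trials
print(estimate)  # should be close to 6/mu = 3.0
```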

    p = Pr(T_B + T_C < T_A) = Pr(T_B + T_C = 2, T_A = 3)
      = Pr(T_B = T_C = 1, T_A = 3) = (1/3)(1/3)(1/3) = 1/27.

c) T_i ~ exp(µ), iid.

    p = Pr(T_B + T_C < T_A)
      = Pr(T_B + T_C < T_A, T_B < T_A)
      = Pr(T_B + T_C < T_A | T_B < T_A) Pr(T_B < T_A)
      = Pr(T_C < T_A) Pr(T_B < T_A)        (T_A memoryless)
      = (1/2)(1/2) = 1/4.

Alternatively,

    p = Pr(C leaves before A, B leaves before A)
      = Pr(C leaves before A | B leaves before A) Pr(B leaves before A)
      = (1/2)(1/2) = 1/4,

where the first factor is 1/2 because the exponential distribution is memoryless and the situation after B leaves is symmetrical, and the second factor is 1/2 by symmetry.

5.6

    p = Pr(Smith is NOT last)
      = Pr(Jones is last) + Pr(Brown is last)
      = Pr(T_B < T_J) Pr(T_S < T_J) + Pr(T_J < T_B) Pr(T_S < T_B)    (see 5.2)
      = (λ_2/(λ_1 + λ_2))(λ_2/(λ_1 + λ_2)) + (λ_1/(λ_1 + λ_2))(λ_1/(λ_1 + λ_2))
      = (λ_2/(λ_1 + λ_2))^2 + (λ_1/(λ_1 + λ_2))^2.

5.10 See the book at page 748. (Page 662 in the 8th edition.)

5.14 X ~ exp(λ).
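Both answers for 5.4 can be checked numerically: part b) by exact enumeration of the 3^3 = 27 equally likely service-time triples, and part c) by Monte Carlo with iid exponential times. The rate µ = 1.0, the seed, and the sample size below are illustrative choices.

```python
import itertools
import random
from fractions import Fraction

# 5.4 b): exact enumeration over the 27 equally likely outcomes (T_A, T_B, T_C).
outcomes = list(itertools.product([1, 2, 3], repeat=3))
hits = sum(1 for ta, tb, tc in outcomes if tb + tc < ta)
p_b = Fraction(hits, len(outcomes))
print(p_b)  # exactly 1/27

# 5.4 c): Monte Carlo with iid Exp(mu) service times (mu = 1.0, illustrative).
random.seed(0)
mu = 1.0
n = 200_000
count = sum(
    1 for _ in range(n)
    if random.expovariate(mu) + random.expovariate(mu) < random.expovariate(mu)
)
print(count / n)  # should be close to 1/4
```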

We need:

i)  P(X < c) = 1 - e^{-λc}
ii) f_{X|X<c}(x)

From the definition (see page 97),

    f_{X|X<c}(x) = f_X(x)/P(X < c) for 0 < x < c, and 0 otherwise.

This gives

    E[X | X < c] = ∫ x f_{X|X<c}(x) dx
                 = (1/(1 - e^{-λc})) ∫_0^c x λ e^{-λx} dx
                 = (1/(1 - e^{-λc})) [ -x e^{-λx} - (1/λ) e^{-λx} ]_0^c
                 = (1/(1 - e^{-λc})) [ (1/λ)(1 - e^{-λc}) - c e^{-λc} ]
                 = 1/λ - c e^{-λc}/(1 - e^{-λc}).

By memorylessness we have

    E[X | X > c] = c + E[X] = c + 1/λ.

Therefore, writing p = P(X < c) = 1 - e^{-λc}, so that 1 - p = e^{-λc}, we get the identity

    E[X | X < c] = (1/p) [ E[X] - (1 - p) E[X | X > c] ]
                 = (1/(1 - e^{-λc})) [ 1/λ - e^{-λc}(c + 1/λ) ]
                 = 1/λ - c e^{-λc}/(1 - e^{-λc}),

in agreement with the direct calculation above.

Exercises from exams

August 04, Problem 2

Let X be the number of type A errors and Y be the number of type B errors, where X is Poisson distributed with intensity λ_1 and Y is Poisson distributed with intensity λ_2.
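The closed form E[X | X < c] = 1/λ - c e^{-λc}/(1 - e^{-λc}) can be verified against a direct numerical integration of x f_{X|X<c}(x). The values λ = 1.5 and c = 2.0 below are arbitrary illustrative choices.

```python
import math

lam, c = 1.5, 2.0   # illustrative rate and truncation point

# Midpoint-rule approximation of the truncated mean:
# E[X | X < c] = (1/P(X < c)) * integral_0^c x * lam * exp(-lam * x) dx
n = 100_000
dx = c / n
integral = sum((i + 0.5) * dx * lam * math.exp(-lam * (i + 0.5) * dx) * dx
               for i in range(n))
p = 1 - math.exp(-lam * c)          # P(X < c)
numeric = integral / p

closed_form = 1 / lam - c * math.exp(-lam * c) / (1 - math.exp(-lam * c))
print(abs(numeric - closed_form) < 1e-6)
```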

a) We want to show that X + Y is Poisson distributed with intensity λ_1 + λ_2:

    P(X + Y = n) = Σ_{k=0}^{n} P(X = k, Y = n-k)
               =(1) Σ_{k=0}^{n} P(X = k) P(Y = n-k)
                 = Σ_{k=0}^{n} (λ_1^k e^{-λ_1}/k!) (λ_2^{n-k} e^{-λ_2}/(n-k)!)
                 = e^{-(λ_1+λ_2)} Σ_{k=0}^{n} λ_1^k λ_2^{n-k}/(k!(n-k)!)
               =(2) e^{-(λ_1+λ_2)} (λ_1 + λ_2)^n / n!

We recognize this as a Poisson distribution with intensity λ_1 + λ_2. The transition denoted by (1) is due to the fact that X and Y are independent. The transition denoted by (2) is due to the fact that the binomial expansion of (λ_1 + λ_2)^n is

    (λ_1 + λ_2)^n = Σ_{k=0}^{n} (n!/(k!(n-k)!)) λ_1^k λ_2^{n-k}.

A natural objection against modeling the number of errors on the component as a Poisson process with constant intensity is that in most cases the probability of an error will increase with increasing time.

b) We let Z = X + Y. Then X is the number of errors of type A, Y is the number of errors of type B, and Z is the total number of errors. We want to find

    P(X = 1 | Z = 1) = P(X = 1, Z = 1)/P(Z = 1)
                   =(1) P(X = 1, Y = 0)/P(Z = 1)
                   =(2) P(X = 1) P(Y = 0)/P(Z = 1)
                     = λ_1 e^{-λ_1} e^{-λ_2} / ((λ_1 + λ_2) e^{-(λ_1+λ_2)})
                     = λ_1/(λ_1 + λ_2).

(1) is due to the fact that if X = 1 and Z = 1, then Y = 0. (2) is due to the fact that X and Y are independent. (X and Z are not independent.)

c) We let X(u) be the number of errors in the interval (0, u]. Similarly, X(t) is the number of errors in the interval (0, t], with u ≤ t. Given that X(t) = n, we want to find the distribution of X(u).
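Both identities in parts a) and b) can be confirmed numerically from the Poisson pmf alone; the intensities λ_1 = 0.7 and λ_2 = 1.9 below are illustrative, not from the exam problem.

```python
import math

def pois(lam, k):
    """Poisson pmf P(N = k) for intensity lam."""
    return lam**k * math.exp(-lam) / math.factorial(k)

lam1, lam2 = 0.7, 1.9   # illustrative intensities

# a) The convolution of the two pmfs equals the Poisson(lam1 + lam2) pmf.
for n in range(10):
    conv = sum(pois(lam1, k) * pois(lam2, n - k) for k in range(n + 1))
    assert abs(conv - pois(lam1 + lam2, n)) < 1e-12

# b) P(X = 1 | Z = 1) = P(X = 1) P(Y = 0) / P(Z = 1) = lam1 / (lam1 + lam2).
cond = pois(lam1, 1) * pois(lam2, 0) / pois(lam1 + lam2, 1)
print(abs(cond - lam1 / (lam1 + lam2)) < 1e-12)
```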

    P(X(u) = k | X(t) = n) = P(X(u) = k, X(t) = n)/P(X(t) = n)
                         =(1) P(X(u) = k, X(t) - X(u) = n-k)/P(X(t) = n)
                           = [e^{-λu}(λu)^k/k!] [e^{-λ(t-u)}(λ(t-u))^{n-k}/(n-k)!] / [e^{-λt}(λt)^n/n!]
                           = (n!/(k!(n-k)!)) (u/t)^k (1 - u/t)^{n-k},

which we recognize as a binomial distribution with parameters n and u/t.

Comment to (1): The number of errors in the interval (0, u] is not independent of the number of errors in the interval (0, t]. The number of errors in the interval (u, t] is, though, independent of the number of errors in the interval (0, u].

We want to find the probability of exactly k (< n) errors in the interval (u, t] given that n errors occur in the interval (0, t]:

    P(X(t) - X(u) = k | X(t) = n) = P(X(u) = n-k | X(t) = n)
                                =(2) (n!/(k!(n-k)!)) (u/t)^{n-k} (1 - u/t)^k,

which we recognize as a binomial distribution with parameters n and 1 - u/t.

Comment to (2): We substitute k → n-k in the distribution derived above.
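As a sketch, the binomial form of the conditional distribution can be checked term by term from the Poisson pmfs of the independent increments; the values of λ, u, t, and n below are illustrative choices.

```python
import math

def pois(lam, k):
    """Poisson pmf P(N = k) for intensity lam."""
    return lam**k * math.exp(-lam) / math.factorial(k)

lam, u, t, n = 1.3, 2.0, 5.0, 7   # illustrative values, with u <= t

# P(X(u) = k | X(t) = n) from the independent increments X(u) and X(t) - X(u),
# compared against the Binomial(n, u/t) pmf; the terms must also sum to 1.
total = 0.0
for k in range(n + 1):
    cond = pois(lam * u, k) * pois(lam * (t - u), n - k) / pois(lam * t, n)
    binom = math.comb(n, k) * (u / t)**k * (1 - u / t)**(n - k)
    assert abs(cond - binom) < 1e-12
    total += cond
print(abs(total - 1.0) < 1e-12)
```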