Markov Repairable Systems with History-Dependent Up and Down States
Markov Repairable Systems with History-Dependent Up and Down States

Lirong Cui, School of Management & Economics, Beijing Institute of Technology, Beijing 100081, P.R. China
Haijun Li, Department of Mathematics, Washington State University, Pullman, WA 99164, U.S.A.
Jinlin Li, School of Management & Economics, Beijing Institute of Technology, Beijing 100081, P.R. China

December 2006. Supported by the NSF of China Grant.
Abstract

This paper introduces a Markov model for a multi-state repairable system in which some states are changeable, in the sense that whether those physical states are up or down depends on the immediately preceding state of the system evolution process. Several reliability indexes, such as availability, mean up time, and steady-state reliability, are calculated using the matrix method. A sufficient condition under which the availabilities of stochastically monotone repairable systems with history-dependent states can be compared is also obtained. Some examples are presented to illustrate the results.

Mathematics Subject Classification: 60K10.

Key words and phrases: Markov repairable system, history-dependent state, multi-state maintenance model, availability, reliability, up time, down time.
1 Introduction

A repairable system operating in a random environment is subjected to degradation, failure and various kinds of repairs. In many cases, some physical states of such a system can be classified as operational, or up states, whereas others are classified as failure, or down states, depending on the physical conditions of the system. However, in certain situations, some system states can be up or down, and their status of being up or down is affected by the recent system evolution history. These situations can occur, for example, in an energy supply system supporting a piece of equipment which produces some energy for its own use when operating. The focus of this paper is on a Markov model for a multi-state repairable system with such history-dependent states.

Consider a Markov repairable system that has 8 states, say states 1, 2, 3, 4, 5, 6, 7 and 8. The states 1, 2, 3, and 4 are functional or up states, and 7 and 8 are failure or down states. The states 5 and 6 are history-dependent. The system evolution follows a continuous-time Markov chain which starts in state 1. As time goes by, the system changes its states due to wear, damage, regular maintenance and repairs. When the system moves into states 5 and 6 from the up states, 5 and 6 are also up states. When the system is in one of the down states, major repairs are needed to restore the system back to the up states. But with possible imperfect repairs, the system may move into states 5 and 6 from the down states, and when this happens, the system is still in the down mode. A repair action on a failed system has to be significant so that the system is brought back to one of the up states 1, 2, 3, and 4, and then works again. For this multi-state repairable system, whether states 5 and 6 are up or down depends on the type of the last state from which the system exits before entering states 5 and 6. Figure 1 shows a possible system evolution path and the corresponding up and down durations of the system.
In general, the system dynamics of a continuously-monitored, multi-state repairable system can be described by a homogeneous, continuous-time Markov chain {X(t), t >= 0}, where X(t) is the system state at time t. The state space S is finite, and can be partitioned into three sets, U, C, and D. The states in U, also called type U states, are the up states of the system, and the states in D, also called type D states, are the down states of the system. The states in C, called type C or changeable states, are history-dependent, in the sense that any type C state is an up state if the last non-type-C state which the chain visits prior to moving into it is of type U, and a down state otherwise. As in our illustrative example above, we equip the state space S = U ∪ C ∪ D with a partial order ⪯ on U, C, and D, such that
[Figure 1: A sample path of the Markov repairable system, showing the system state over time and the corresponding up and down periods.]

1. if state a ∈ U and state b ∈ C ∪ D are comparable, then a ⪯ b, and
2. if state a ∈ C and state b ∈ D are comparable, then a ⪯ b.

Such a multi-state Markov repairable system has a natural interpretation: the changeable set C serves as a set of boundary states between up and down states, and the system in a type C state may need some preventive maintenance actions or minor repairs to get back to a good operational condition, whereas in a down state, major repairs are needed to restore the system to be operational again.

In this paper, we use matrix methods to present a steady-state analysis for the system, and obtain explicit expressions for several system reliability indexes, such as availability, steady-state reliability, and mean up and down times. We also compare availabilities for multi-state stochastically monotone repairable systems (see the definition in Section 2) with different model parameters, and our results reveal some structural insights into the system dynamics with history-dependent states. Markov repairable systems with fixed functional and failure states have been studied in Shaked and Shanthikumar (1990), and Sumita, Shanthikumar and Masuda (1987), and the references therein. However, the system with history-dependent states has not been studied in the reliability literature, and such a system provides a realistic model for multi-state repairable systems with actual implementations of various maintenance policies. By enlarging the state space, a system with history-dependent states can be converted into a system with fixed up and down states. But this approach adds more computational burden.
Instead, our method to derive computational formulas is similar to the matrix method employed by Colquhoun and Hawkes (1982, 1990) and Jalali and Hawkes (1992a, 1992b) in their studies of ion channels. This matrix method is powerful, and allows us to obtain efficient formulas for all the popular reliability indexes for the system performance.

The paper is organized as follows. Section 2 discusses the system availability. Section 3 studies the system up and down times and steady-state reliability. Section 4 presents a numerical example to illustrate the results obtained in the paper. Finally, some remarks in Section 5 conclude the paper. Throughout this paper, the terms increasing and decreasing mean non-decreasing and non-increasing, respectively. A function on S is often written as a column vector of dimension |S|, whereas a probability mass function on S is always written as a row vector of dimension |S|. All the matrices and vectors are written in bold face, and in any product of matrices and/or vectors the terms have appropriate dimensions.

2 System Availability and Its Comparisons

In this section, we first obtain a formula for the availability of a Markov repairable system with history-dependent states, and then compare the availabilities of repairable systems with different model parameters. Our result shows that the availability of a stochastically monotone repairable system operating in a harsher environment is smaller.

Consider the Markov chain {X(t), t >= 0} introduced in Section 1. Let n be the number of states in the state space S, which has a minimal state i_min. First, we define an n-dimensional (row) vector p(t) = (p_j(t), j ∈ S), with elements given by

p_j(t) = P(system is in state j at time t) = P(X(t) = j),  j ∈ S. (2.1)

It is well known that the probability vector p(t) satisfies dp(t)/dt = p(t)Q, where the n x n matrix Q = (q_ij) is the infinitesimal generator of the chain.
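As a numerical aside, the forward equation dp(t)/dt = p(t)Q can be integrated directly and compared with the closed form p(t) = p_0 exp(Qt) discussed below. The sketch uses a hypothetical 3-state generator; it is an illustration, not part of the paper's model:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

# Hypothetical 3-state generator: off-diagonals >= 0, each row sums to zero.
Q = np.array([[-1.0, 0.5, 0.5],
              [2.0, -3.0, 1.0],
              [0.0, 1.0, -1.0]])
assert np.allclose(Q.sum(axis=1), 0.0)  # a valid generator

p0 = np.array([1.0, 0.0, 0.0])  # chain starts in its minimal state
t = 1.3

# Integrate the forward (Kolmogorov) equation dp/dt = p Q ...
sol = solve_ivp(lambda _, p: p @ Q, (0.0, t), p0, rtol=1e-10, atol=1e-12)
p_ode = sol.y[:, -1]

# ... and compare with the closed form p(t) = p0 exp(Qt).
p_closed = p0 @ expm(Q * t)
print(np.allclose(p_ode, p_closed, atol=1e-6))
```

Both routes give the same distribution; the matrix-exponential form is the one used throughout the paper.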
Assume the system is new at time t = 0, and thus the Markov chain starts in i_min almost surely. Let p_0 be the initial probability vector of the process. We also define an n x n matrix P(t) = (P_ij(t)), with elements given by

P_ij(t) = P(X(t) = j | X(0) = i),  i, j ∈ S. (2.2)
It is also a standard result that dP(t)/dt = P(t)Q, with P(0) = I. Hereafter, I denotes an identity matrix of appropriate size. Thus P(t) = exp(Qt) and p(t) = p_0 exp(Qt) for any t >= 0. Let φ̂(s) denote the Laplace transform of a function φ(t); then the Laplace transforms of p(t) and P(t) are given by, respectively,

p̂(s) = p_0 (sI - Q)^{-1},  P̂(s) = (sI - Q)^{-1}. (2.3)

To calculate the reliability indexes explicitly, it is crucial to obtain the probability that the repairable system {X(t), t >= 0} remains within a specific set of states from 0 up to time t. Let the state space S = A ∪ B, with A ∩ B = ∅. The generator Q can now be partitioned according to A and B as follows:

Q = [ Q_AA  Q_AB
      Q_BA  Q_BB ]. (2.4)

Consider an |A| x |A| matrix P_AA(t) = (_A P_ij(t)), where

_A P_ij(t) = P(X(t) = j, X(s) ∈ A for all 0 <= s <= t | X(0) = i),  i, j ∈ A. (2.5)

It is known (see, for example, Colquhoun and Hawkes 1982) that dP_AA(t)/dt = P_AA(t) Q_AA, with the initial condition P_AA(0) = I. Thus P_AA(t) = exp(Q_AA t), and its Laplace transform is given by P̂_AA(s) = (sI - Q_AA)^{-1}.

Since the state space is finite, we can use uniformization to obtain an alternative representation for p(t). Choose a real number q < ∞ for the generator Q such that

q >= max_{i ∈ S} {-q_ii}.

Define the stochastic matrix

A_q = I + Q/q, (2.6)

and A_q is known as the probability transition matrix of the embedded, discrete-time Markov chain of {X(t), t >= 0}. Substituting Q = -q(I - A_q) in p(t) = p_0 exp(Qt), we obtain that

p(t) = p_0 [ Σ_{k=0}^∞ e^{-qt} ((qt)^k / k!) A_q^k ] = Σ_{k=0}^∞ e^{-qt} ((qt)^k / k!) (p_0 A_q^k). (2.7)
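The uniformization representation (2.7) lends itself to a quick sanity check against the matrix exponential. The sketch below truncates the Poisson-weighted series for a hypothetical 3-state generator (the matrix is made up for illustration):

```python
import math

import numpy as np
from scipy.linalg import expm

# Hypothetical 3-state generator (each row sums to zero).
Q = np.array([[-2.0, 1.5, 0.5],
              [1.0, -3.0, 2.0],
              [0.5, 2.5, -3.0]])
p0 = np.array([1.0, 0.0, 0.0])
t = 0.7

# Direct computation: p(t) = p0 exp(Qt).
p_direct = p0 @ expm(Q * t)

# Uniformization (2.6)-(2.7): q >= max_i(-q_ii), A_q = I + Q/q,
# then sum the Poisson-weighted powers p0 A_q^k.
q = max(-np.diag(Q))
A_q = np.eye(3) + Q / q
p_unif = np.zeros(3)
term = p0.copy()            # holds p0 A_q^k
weight = math.exp(-q * t)   # holds e^{-qt} (qt)^k / k!
for k in range(200):        # truncate the infinite series
    p_unif += weight * term
    term = term @ A_q
    weight *= q * t / (k + 1)

print(np.allclose(p_direct, p_unif))  # prints True
```

The series converges rapidly here since qt is small; in general the truncation level can be chosen from the Poisson tail probability.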
This representation of {X(t), t >= 0} is known as the discrete-time Markov chain subordinated to a Poisson process.

The instantaneous availability A(t) of a system at time t is the probability that the system is functioning at time t, and its limit A(∞) = lim_{t→∞} A(t), if it exists, is called the steady-state system availability.

Theorem 2.1. For a Markov repairable system with history-dependent states, the instantaneous availability is given by

A(t) = Σ_{k∈U} p_k(t) + Σ_{i∈U} Σ_{j∈C} Σ_{l∈C} ∫_0^t p_i(u) q_ij  _C P_jl(t - u) du,

and its steady-state availability is given by

A(∞) = lim_{s→0} s [ Σ_{k∈U} p̂_k(s) + Σ_{i∈U} Σ_{j∈C} Σ_{l∈C} q_ij p̂_i(s)  _C P̂_jl(s) ].

Proof. The first term in the expression for A(t) is the probability that the system stays within U at time t, and the second term corresponds to the probability that after having stayed in U up to time u, the system moves into C and stays there from u to t. The system availability is the sum of these two terms. The steady-state availability then follows from a Tauberian theorem (Widder 1946) and the convolution formula for the Laplace transform.

Theorem 2.1 can be used to calculate the system availability (see Section 4), but offers little insight into how the availability would change in response to a change of system parameters. To understand this, we utilize the stochastic comparison method. Let X and Y be two S-valued random variables with n-dimensional probability mass (row) vectors p and q respectively. Let e_A denote the indicator function (n-dimensional column vector) of a subset A of S; that is, e_A(i) = 1 if i ∈ A ⊆ S, and zero otherwise. X is said to be larger than Y in the usual stochastic order (denoted by X >=_st Y or p >=_st q) if

p e_A >= q e_A (2.8)

for all upper sets A ⊆ S. (A subset A ⊆ S is called upper if i ∈ A and i ⪯ j imply that j ∈ A.) It is easy to verify that X >=_st Y if and only if Eφ(X) = pφ >= qφ = Eφ(Y) for all real increasing functions (n-dimensional column vectors) φ : S → R.
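On a totally ordered state space, the upper sets are exactly the tails {j, j+1, ..., n}, so condition (2.8) reduces to comparing tail sums of the two mass vectors. A small illustrative check, with made-up vectors:

```python
import numpy as np

def st_geq(p, q):
    """Check p >=_st q on the totally ordered space {1,...,n}:
    every tail sum of p must dominate the same tail sum of q, as in (2.8)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    # Tail sums P(X >= j), computed via reversed cumulative sums.
    tails_p = np.cumsum(p[::-1])[::-1]
    tails_q = np.cumsum(q[::-1])[::-1]
    return bool(np.all(tails_p >= tails_q - 1e-12))

p = [0.2, 0.3, 0.5]   # puts more mass on the higher states
q = [0.5, 0.3, 0.2]
print(st_geq(p, q), st_geq(q, p))  # prints: True False
```

On a general partially ordered space one would instead enumerate all upper sets, which is what the proofs below do implicitly.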
The stochastic comparison of Markov chains involves the following notions (see Massey 1987, Li and Shaked 1994).
Definition 2.2. Let {X(t), t >= 0} and {X'(t), t >= 0} be two Markov chains with generators Q and Q' respectively.

1. The Markov chain {X(t), t >= 0} is called stochastically monotone if there exists a q such that for any upper set A ⊆ S, the function (n-dimensional column vector) A_q e_A : S → R is increasing, where A_q is the stochastic matrix defined by (2.6).

2. We say Q <=_st Q' if Q e_A <= Q' e_A component-wise for any upper set A ⊆ S.

It is straightforward to verify from (2.6) that for any upper subset A ⊆ S, if A_q e_A is increasing, then A_{q'} e_A is also increasing for any q' >= q. Because of (2.6), loosely speaking, a stochastically monotone Markov chain is more likely to move higher from a larger state, whereas Q <=_st Q' means that {X'(t), t >= 0} is more likely to move higher from any state than {X(t), t >= 0} does. The following result, due to Whitt (1986) and Massey (1987), describes the stochastic comparison of Markov chains.

Theorem 2.3. Let {X(t), t >= 0} and {X'(t), t >= 0} be two Markov chains with the same finite state space S and the same initial probability vector p_0, but different generators Q and Q' respectively. If

1. one of the processes is stochastically monotone, and
2. Q <=_st Q',

then X(t) <=_st X'(t) for any time t >= 0.

Using this result, we can obtain the availability comparison of two Markov repairable systems. Let {X(t), t >= 0} and {X'(t), t >= 0} be two Markov repairable systems with history-dependent states, as described in Section 1. Let A(t) and A'(t) (A(∞) and A'(∞)) denote the (steady-state) availabilities of X(t) and X'(t) respectively.

Theorem 2.4. Suppose that the two systems have the same finite state space S and the same initial probability vector p_0, but different generators Q = (q_ij) and Q' = (q'_ij) respectively. If

1. one of the processes is stochastically monotone, and
2. Q <=_st Q',

then A(t) >= A'(t) for any time t >= 0, and also A(∞) >= A'(∞).
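Before turning to the proof, note that the monotonicity condition in Definition 2.2 can be checked mechanically on a totally ordered space: A_q e_A must be increasing in the starting state for every tail set A. The sketch below tests a hypothetical birth-death generator, which is monotone, against one in which state 1 jumps over state 2, which is not (both matrices are made up for illustration):

```python
import numpy as np

def is_stochastically_monotone(Q):
    """Definition 2.2(1) on the totally ordered space {1,...,n}:
    A_q e_A must be increasing for every upper (tail) set A."""
    Q = np.asarray(Q, float)
    n = Q.shape[0]
    q = max(-np.diag(Q))
    A_q = np.eye(n) + Q / q          # stochastic matrix (2.6)
    for j in range(n):               # tail set A = {j+1, ..., n}
        v = A_q[:, j:].sum(axis=1)   # A_q e_A, one entry per starting state
        if np.any(np.diff(v) < -1e-12):
            return False
    return True

# A birth-death generator: stochastically monotone.
Q_bd = np.array([[-1.0, 1.0, 0.0],
                 [1.0, -2.0, 1.0],
                 [0.0, 1.0, -1.0]])
# State 1 jumps straight to state 3, state 2 never does: not monotone.
Q_jump = np.array([[-2.0, 0.0, 2.0],
                   [2.0, -2.0, 0.0],
                   [0.0, 1.0, -1.0]])
print(is_stochastically_monotone(Q_bd), is_stochastically_monotone(Q_jump))
# prints: True False
```

For a general partial order the same check runs over all upper sets rather than tails only.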
Proof. The idea is to convert the repairable system with history-dependent states into a repairable system with fixed up and down states, so that the availability has a simpler expression. Let S̄ = S ∪ C̄ be the new state space, where C̄ contains the same number of states as C, such that for any i ∈ C there is correspondingly a unique ī ∈ C̄. Extend the partial order ⪯ in S to S̄ as follows.

1. Retain all the ordering relations among the states in S = U ∪ C ∪ D.
2. For any ī ∈ C̄ and j ∈ D, if i ⪯ j in S, then define ī ⪯ j in S̄.
3. For any i ∈ U and j̄ ∈ C̄, if i ⪯ j in S, then define i ⪯ j̄ in S̄.
4. For any ī, j̄ ∈ C̄, if i ⪯ j in S, then define ī ⪯ j̄ and i ⪯ j̄ in S̄.

Note that for any state i ∈ C, i ⪯ ī in S̄, but no state in C is larger than any state in C̄. It is straightforward to verify that S̄ is a partially ordered space with this extension of ⪯.

Let {Y(t), t >= 0} and {Y'(t), t >= 0} be two Markov chains with state space S̄, starting at i_min almost surely. The generator Q̄ = (q̄_ij) (Q̄' = (q̄'_ij)) of {Y(t)} ({Y'(t)}) is defined as follows:

q̄_ij (q̄'_ij) =
  q_ij (q'_ij)   if i ∈ U ∪ C ∪ D, and j ∈ U ∪ D,
  q_ij (q'_ij)   if i ∈ U ∪ C, and j ∈ C,
  q_kj (q'_kj)   if i = k̄ ∈ C̄, and j ∈ U ∪ D,
  q_kl (q'_kl)   if i = k̄ ∈ C̄, and j = l̄ ∈ C̄,
  q_ik (q'_ik)   if i ∈ D, and j = k̄ ∈ C̄,
  0 (0)          otherwise. (2.9)

Note that the new Markov chains cannot move directly from U ∪ C to C̄, nor from D ∪ C̄ to C (Figure 2). Obviously A(t) = P(Y(t) ∈ U ∪ C) and A'(t) = P(Y'(t) ∈ U ∪ C). In order to use Theorem 2.3 to compare the processes, we need to show that the sufficient conditions in Theorem 2.3 hold for {Y(t), t >= 0} and {Y'(t), t >= 0}. We only prove the case where {X(t), t >= 0} is stochastically monotone; the other case is similar. Let A_q = (a_ij) and Ā_q = (ā_ij) be the stochastic matrices for X(t) and Y(t), respectively, defined as in (2.6).
[Figure 2: The new Markov chain on the enlarged state space, ordered from U (lowest) through C and C̄ to D (highest).]

From (2.9), we have

ā_ij =
  a_ij   if i ∈ U ∪ C ∪ D, and j ∈ U ∪ D,
  a_ij   if i ∈ U ∪ C, and j ∈ C,
  a_kj   if i = k̄ ∈ C̄, and j ∈ U ∪ D,
  a_kl   if i = k̄ ∈ C̄, and j = l̄ ∈ C̄,
  a_ik   if i ∈ D, and j = k̄ ∈ C̄,
  0      otherwise. (2.10)

The monotonicity of {X(t), t >= 0} implies that A_q e_A is increasing for any upper set A ⊆ S. For any upper subset Ā ⊆ S̄, let

B = Ā ∩ (U ∪ C ∪ D) = (Ā ∩ U) ∪ (Ā ∩ C) ∪ (Ā ∩ D),
B̄ = (Ā ∩ U) ∪ {i ∈ C : ī ∈ Ā ∩ C̄} ∪ (Ā ∩ D).

Since Ā is upper, Ā ∩ S is an upper subset of S, and also B is an upper subset of U ∪ C ∪ D. We claim that B̄ is an upper subset of S. For this, consider i ∈ B̄, and i ⪯ j for j ∈ S. We need to show that j ∈ B̄.

1. Suppose that i ∈ Ā ∩ (U ∪ D). Because Ā is upper, j ∈ Ā and j̄ ∈ Ā.
(a) If j ∈ U ∪ D, then j ∈ Ā ∩ (U ∪ D) ⊆ B̄.
(b) If j ∈ C, then j̄ ∈ Ā ∩ C̄, which implies that j ∈ {i ∈ C : ī ∈ Ā ∩ C̄} ⊆ B̄.

2. Suppose that i ∈ C such that ī ∈ Ā ∩ C̄.

(a) If j ∈ C, then ī ⪯ j̄ and so j̄ ∈ Ā. Thus j̄ ∈ Ā ∩ C̄, which implies that j ∈ {i ∈ C : ī ∈ Ā ∩ C̄} ⊆ B̄.
(b) If j ∈ D, then ī ⪯ j and so j ∈ Ā. Thus j ∈ Ā ∩ D ⊆ B̄.

Therefore, in any case, j ∈ B̄. Indeed, B̄ is upper.

We also claim that Ā ∩ S ⊆ B̄. In fact, for any j ∈ Ā ∩ C, we have j ⪯ j̄ ∈ C̄. Since Ā is upper, j̄ ∈ Ā. Thus j̄ ∈ Ā ∩ C̄, which implies that j ∈ {i ∈ C : ī ∈ Ā ∩ C̄}. Hence Ā ∩ C ⊆ {i ∈ C : ī ∈ Ā ∩ C̄}, and Ā ∩ S ⊆ B̄.

We first show that {Y(t), t >= 0} is stochastically monotone. We need to verify that Ā_q e_Ā is increasing for any given upper subset Ā ⊆ S̄. Let a_i (ā_i) denote the ith row vector of A_q (Ā_q). From our construction, there are three cases.

1. For any i ∈ U ∪ C, since {Y(t), t >= 0} cannot move directly from i into C̄ (see (2.10)), ā_i e_Ā = a_i e_{Ā∩S}. Hence ā_i e_Ā is increasing in i within U ∪ C.

2. For any i ∈ D ∪ C̄, {Y(t), t >= 0} cannot move directly from i into C (2.10).

(a) If i ∈ D, then ā_i e_Ā = ā_i e_B̄ = a_i e_B̄. Hence ā_i e_Ā is increasing in i within D.
(b) If i = k̄ ∈ C̄, then ā_i e_Ā = ā_i e_B̄ = a_k e_B̄. Hence ā_i e_Ā is increasing in i within C̄.
(c) Consider i = k̄ ∈ C̄ and j ∈ D with i ⪯ j. From our construction, we have k ⪯ j, and thus ā_i e_Ā = ā_i e_B̄ = a_k e_B̄ <= a_j e_B̄ = ā_j e_B̄ = ā_j e_Ā.

Therefore, ā_i e_Ā is increasing in i within D ∪ C̄.

3. Consider any i ∈ U ∪ C, and j ∈ D ∪ C̄ with i ⪯ j.

(a) If j ∈ D, then ā_i e_Ā = a_i e_{Ā∩S} <= a_j e_{Ā∩S} <= a_j e_B̄ = ā_j e_Ā.
(b) If j ∈ C̄, then j = k̄ for some k ∈ C. From our construction of the partial ordering, i ⪯ k̄ is equivalent to i ⪯ k. Thus ā_i e_Ā = a_i e_{Ā∩S} <= a_k e_{Ā∩S} <= a_k e_B̄ = ā_j e_B̄ = ā_j e_Ā.
Therefore, ā_i e_Ā is increasing in i for any upper subset Ā ⊆ S̄.

Next, we show that Q̄ <=_st Q̄'. Let q_i (q̄_i) denote the ith row vector of Q (Q̄), and q'_i (q̄'_i) denote the ith row vector of Q' (Q̄'). We now verify that for any upper set Ā ⊆ S̄, q̄_i e_Ā <= q̄'_i e_Ā for any i ∈ S̄.

1. For any i ∈ U ∪ C,
   q̄_i e_Ā = q_i e_{Ā∩S} <= q'_i e_{Ā∩S} = q̄'_i e_Ā.
2. For any i ∈ D,
   q̄_i e_Ā = q̄_i e_B̄ = q_i e_B̄ <= q'_i e_B̄ = q̄'_i e_B̄ = q̄'_i e_Ā.
3. For any i = k̄ ∈ C̄,
   q̄_i e_Ā = q̄_i e_B̄ = q_k e_B̄ <= q'_k e_B̄ = q̄'_i e_B̄ = q̄'_i e_Ā.

Thus Q̄ e_Ā <= Q̄' e_Ā component-wise for any upper set Ā. From Theorem 2.3, we obtain that Y(t) <=_st Y'(t) for any t. Since D ∪ C̄ is an upper subset of S̄, P(Y(t) ∈ D ∪ C̄) <= P(Y'(t) ∈ D ∪ C̄), and hence A(t) >= A'(t) for any t. Taking limits leads to A(∞) >= A'(∞).

Our interpretation of Q <=_st Q' is that the system X'(t) receives severer wear and damage than X(t) does, and thus the availability of X'(t) is smaller. The technique used in Theorem 2.4 can also be used for the availability calculation. Since the new state space S̄ has fixed up and down states,

A(t) = P(Y(t) ∈ U ∪ C) = p_0 exp(Q̄t) e_{U∪C}. (2.11)

However, enlarging the state space may add an extra computational burden, especially for a large class of history-dependent states.

Example 2.5. Consider a continuous-time Markov repairable system {X(t), t >= 0} with state space S = {1, 2, 3}, where 1 is the up state, 3 is the down state, and 2 is changeable. The state space is equipped with the natural ordering. The Markov chain starts in state 1, and has generator Q.
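The enlargement (2.9) is straightforward to carry out numerically. The sketch below builds S̄ = {1, 2, 2̄, 3} for a hypothetical generator on S = {1, 2, 3} with U = {1}, C = {2}, D = {3}; the numerical entries are illustrative only, not those of the paper's example:

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical generator on S = {1, 2, 3}: 1 up, 2 changeable, 3 down.
Q = np.array([[-2.0, 1.0, 1.0],
              [1.0, -2.0, 1.0],
              [1.0, 1.0, -2.0]])

# Enlarged space, ordered as (1, 2, 2bar, 3), following (2.9):
# from 1 and 2 transitions into C go to state 2; from 3 and 2bar, to 2bar.
Qbar = np.array([
    [Q[0, 0], Q[0, 1], 0.0,     Q[0, 2]],  # i in U
    [Q[1, 0], Q[1, 1], 0.0,     Q[1, 2]],  # i in C
    [Q[1, 0], 0.0,     Q[1, 1], Q[1, 2]],  # i = 2bar copies the row of 2
    [Q[2, 0], 0.0,     Q[2, 1], Q[2, 2]],  # i in D, C-transitions redirected
])

p0 = np.array([1.0, 0.0, 0.0, 0.0])    # start in state 1
e_up = np.array([1.0, 1.0, 0.0, 0.0])  # indicator of the fixed up set {1, 2}

def availability(t):
    # (2.11): on the enlarged space, A(t) = p0 exp(Qbar t) e_{U ∪ C}.
    return p0 @ expm(Qbar * t) @ e_up

print(availability(0.0))  # prints 1.0: a new system is available
```

Each row of Qbar still sums to zero, so the enlarged chain is a valid Markov chain with fixed up and down states.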
Taking q as in (2.6), the corresponding stochastic matrix A_q is such that A_q e_A is increasing for every upper set A; thus {X(t), t >= 0} is stochastically monotone. Let {X'(t), t >= 0} be another Markov repairable system with the same state space and the same initial probability vector, and generator Q'. It is easy to verify that Q <=_st Q', and hence A(t) >= A'(t) for any t. To calculate the availability using (2.11), we enlarge the state space and let S̄ = {1, 2, 2̄, 3} with the ordering 1 ⪯ 2 ⪯ 2̄ ⪯ 3. The corresponding generators Q̄ and Q̄' on this enlarged state space are given by (2.9). It follows from (2.11) that the availability of X(t) is given by A(t) = (1, 0, 0, 0) exp(Q̄t) e_{1,2}, and the availability of X'(t) is given by A'(t) = (1, 0, 0, 0) exp(Q̄'t) e_{1,2}.

3 Steady-State Analysis of System Up and Down Times

To calculate the distributions of up and down times for a Markov repairable system {X(t), t >= 0} with history-dependent states, a useful quantity is the probability that the system stays within a subset A of the state space S up to time t and then exits from A to a state outside. Let the state space S = A ∪ B. Consider

g_ij^{AB}(t) = lim_{Δt→0} (1/Δt) P(X(s) ∈ A for all 0 <= s <= t, X(t + Δt) = j | X(0) = i),  i ∈ A, j ∈ B.

It follows from Colquhoun and Hawkes (1982) that

g_ij^{AB}(t) = Σ_{r∈A}  _A P_ir(t) q_rj,  i ∈ A, j ∈ B, (3.1)
where _A P_ir(t) is defined by (2.5). In matrix form, we obtain an |A| x |B| matrix G_AB(t) = (g_ij^{AB}(t)) = P_AA(t) Q_AB, where Q_AB is given in the partitioned matrix (2.4). Its Laplace transform is given by Ĝ_AB(s) = (sI - Q_AA)^{-1} Q_AB. In particular, Ĝ_AB(0) = (ḡ_ij^{AB}(0)) = -Q_AA^{-1} Q_AB, where

ḡ_ij^{AB}(0) = ∫_0^∞ g_ij^{AB}(u) du = P(system exits from A to j | X(0) = i),  i ∈ A, j ∈ B.

Let

G_AB = -Q_AA^{-1} Q_AB (3.2)

denote the matrix of exit probabilities. Note that g_ij^{AB}(t), i ∈ A, j ∈ B, is not a proper probability density function.

We now partition the generator of the Markov chain {X(t), t >= 0} according to the types of states defined in Section 1. Let

Q = [ Q_UU  Q_UC  Q_UD
      Q_CU  Q_CC  Q_CD
      Q_DU  Q_DC  Q_DD ]. (3.3)

Suppose that the system is in the steady state. An up duration for the repairable system begins when a transition from D directly to U (D → U), or from D, via C, to U (D → C → U), occurs. The rate at which the system goes into the up state j is given by

Σ_{i∈D} p_i(∞) [ q_ij + Σ_{k∈C} q_ik ḡ_kj^{CU}(0) ],  j ∈ U,

where p_i(∞) denotes the equilibrium probability of the system being in state i. Let u_U = (u_j, j ∈ U) be the vector of probability masses with which the system goes into the up states; then

u_U = p_D(∞)(Q_DU + Q_DC G_CU) / [ p_D(∞)(Q_DU + Q_DC G_CU) e ], (3.4)

where p_D(∞) = (p_i(∞), i ∈ D), and hereafter e denotes the vector of 1s with appropriate dimension.

Theorem 3.1. Let {X(t), t >= 0} be a Markov repairable system with history-dependent states, as described in Section 1. The probability density function of the system lifetime in the steady state is given by

f_up(t) = (u_U, 0) G_WD(t) e,  where W = U ∪ C.

The mean up time for the repairable system in the steady state is

m_up = (u_U, 0) Q_WW^{-2} Q_WD e.
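The exit-probability matrix G_AB = -Q_AA^{-1} Q_AB is a single linear solve. The sketch below uses a hypothetical 4-state generator with A = {1, 2} and B = {3, 4}, and checks that each row of G_AB sums to one when exit from A is certain:

```python
import numpy as np

# Hypothetical generator on S = {1, 2, 3, 4}; A = {1, 2}, B = {3, 4}.
Q = np.array([[-3.0, 1.0, 1.0, 1.0],
              [1.0, -3.0, 1.0, 1.0],
              [1.0, 1.0, -3.0, 1.0],
              [1.0, 1.0, 1.0, -3.0]])
A, B = [0, 1], [2, 3]
Q_AA = Q[np.ix_(A, A)]
Q_AB = Q[np.ix_(A, B)]

# (3.2): G_AB = -Q_AA^{-1} Q_AB, computed as a linear solve rather
# than an explicit inverse.
G_AB = np.linalg.solve(-Q_AA, Q_AB)
print(G_AB.sum(axis=1))  # prints [1. 1.]: exit from A is certain
```

By symmetry of this particular generator, each exit probability here equals 0.5; in general the rows of G_AB sum to one exactly when the chain leaves A with probability one from every starting state in A.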
Proof. The expression for f_up follows from the fact that the system up time starts at a transition D → U or D → C → U, and ends when a transition from W to D occurs. Since the exit from W is certain, we have

∫_0^∞ (u_U, 0) G_WD(t) e dt = P(system exits from W | system starts at an up state) = 1.

Thus (u_U, 0) G_WD(t) e is a proper density function. Since G_WD(t) = P_WW(t) Q_WD, its Laplace transform is given by Ĝ_WD(s) = (sI - Q_WW)^{-1} Q_WD. Thus

f̂_up(s) = (u_U, 0)(sI - Q_WW)^{-1} Q_WD e.

This implies that

m_up = -(d f̂_up(s)/ds)|_{s=0} = (u_U, 0) Q_WW^{-2} Q_WD e.

The steady-state reliability of the system can also be calculated as follows:

R(t) = ∫_t^∞ f_up(u) du = -(u_U, 0) exp(Q_WW t) Q_WW^{-1} Q_WD e.

Similarly, the distribution of the system down time in the steady state can also be obtained.

Theorem 3.2. Let {X(t), t >= 0} be a Markov repairable system with history-dependent states, as described in Section 1. The probability density function of the system down time in the steady state is given by

f_down(t) = (u_D, 0) G_FU(t) e,

where F = D ∪ C, and

u_D = p_U(∞)(Q_UD + Q_UC G_CD) / [ p_U(∞)(Q_UD + Q_UC G_CD) e ]

is the vector of probabilities with which the repairable system goes into the down states. The mean down time for the repairable system in the steady state is

m_down = (u_D, 0) Q_FF^{-2} Q_FU e.

Note that since the exit from F is certain, f_down(t) is a proper probability density function. As we mentioned before, the system in a type U state operates in a good condition, whereas the system in a type D state needs a major repair to get back to working condition. The system in a type C state may be functional in some cases, but may need some minor repairs to restore it to a good operational condition. It is then of interest to calculate the total time that the system operates in a good condition per major repair cycle; that is, the
total time that the system spends in U within the set of up states before moving to a down state.

Let {X(t), t >= 0} be a Markov repairable system with history-dependent states, as described in Section 1. Let f_good(t) be the probability density function of the total time that the system, in the steady state, spends in U per major repair cycle. To calculate f_good(t), we use the method employed in Colquhoun and Hawkes (1982). First, we observe that the Laplace transform of the sojourn time of the last visit to U before moving to a down state is given by

[ Ĝ_UC(s) G_CD + Ĝ_UD(s) ] e_D.

Given that there are k visits to U before going to a down state, the sum of the first k - 1 occupation times in U has the Laplace transform

u_U [ Ĝ_UC(s) G_CU ]^{k-1},

where u_U is the vector of probabilities that the system goes into an up state in the steady state. Thus, the Laplace transform of f_good(t) is given by

f̂_good(s) = Σ_{k=1}^∞ u_U [ Ĝ_UC(s) G_CU ]^{k-1} [ Ĝ_UC(s) G_CD + Ĝ_UD(s) ] e_D
          = u_U [ I - Ĝ_UC(s) G_CU ]^{-1} [ Ĝ_UC(s) G_CD + Ĝ_UD(s) ] e_D.

Since Ĝ_UC(s) = (sI - Q_UU)^{-1} Q_UC and Ĝ_UD(s) = (sI - Q_UU)^{-1} Q_UD, we have

f̂_good(s) = u_U [ I - (sI - Q_UU)^{-1} Q_UC G_CU ]^{-1} (sI - Q_UU)^{-1} [ Q_UC G_CD + Q_UD ] e_D
          = u_U { (sI - Q_UU)[ I - (sI - Q_UU)^{-1} Q_UC G_CU ] }^{-1} [ Q_UC G_CD + Q_UD ] e_D
          = u_U [ sI - Q_UU - Q_UC G_CU ]^{-1} [ Q_UC G_CD + Q_UD ] e_D.

Noticing that the Laplace transform of exp[(Q_UU + Q_UC G_CU)t] is [sI - Q_UU - Q_UC G_CU]^{-1}, we then obtain the density function f_good. This and related results are summarized in the following theorem.

Theorem 3.3. Consider a Markov repairable system with history-dependent states.

1. The probability density function of the total time that the system operates in a good condition (that is, in type U states) per major repair cycle is given by

f_good(t) = u_U exp[(Q_UU + Q_UC G_CU)t] [ Q_UC G_CD + Q_UD ] e_D. (3.5)
The mean time m_good that the system operates in a good condition per major repair cycle is given by

m_good = u_U (Q_UU + Q_UC G_CU)^{-2} [ Q_UC G_CD + Q_UD ] e_D. (3.6)

2. The probability density function of the total time that the system stays in an overhaul mode (that is, in type D states) per down cycle is given by

f_bad(t) = u_D exp[(Q_DD + Q_DC G_CD)t] [ Q_DC G_CU + Q_DU ] e_U. (3.7)

The mean time m_bad that the system stays in an overhaul mode per down cycle is given by

m_bad = u_D (Q_DD + Q_DC G_CD)^{-2} [ Q_DC G_CU + Q_DU ] e_U. (3.8)

In fact, our Markov repairable system with history-dependent states is similar to the stochastic model of ion channels developed by Colquhoun and Hawkes (1982). It is thus not surprising that the matrix method of Colquhoun and Hawkes (1982) can be used to calculate the reliability indexes of Markov repairable systems. Note, however, that the difference between our model and the ion channel model of Colquhoun and Hawkes (1982) is that some states in our repairable system are of changeable type.

4 A Numerical Example

In this section, we present an example to illustrate the results obtained in the previous sections. Consider a repairable system {X(t), t >= 0} with 6 states, S = {1, 2, 3, 4, 5, 6}. Let U = {1, 2}, C = {3, 4}, D = {5, 6}. Its generator is partitioned as

Q = [ Q_UU  Q_UC  Q_UD
      Q_CU  Q_CC  Q_CD
      Q_DU  Q_DC  Q_DD ].

By solving p(∞)Q = 0 with p(∞)e = 1, we obtain the equilibrium probabilities p_i(∞), i = 1, ..., 6.
On the other hand, G_CU = -Q_CC^{-1} Q_CU, and

u_U = p_D(∞)(Q_DU + Q_DC G_CU) / [ p_D(∞)(Q_DU + Q_DC G_CU) e ].

After some matrix manipulations using the Maple software, f_up(t) is obtained as a mixture of four exponential terms, and the mean up time m_up for the repairable system follows from Theorem 3.1. Similarly,

u_D = p_U(∞)(Q_UD + Q_UC G_CD) / [ p_U(∞)(Q_UD + Q_UC G_CD) e ],

and f_down(t) is obtained as a combination of exponential terms and exponentially damped sine and cosine terms, which yields the mean down time m_down. After taking the Laplace transform and the inverse transform, we obtain that

A(t) = p_1(t) + p_2(t) + Σ_{i=1}^{2} Σ_{j=3}^{4} Σ_{l=3}^{4} ∫_0^t p_i(u) q_ij  _C P_jl(t - u) du,

where p_1(t) and p_2(t) are combinations of exponential and exponentially damped trigonometric terms, and the entries _C P_jl(t) of exp(Q_CC t) are expressed in terms of the hyperbolic functions sinh(x) and cosh(x). The curve of the availability A(t) is given in the top graph of Figure 3. The segment of the availability curve where t >= 1.8 is magnified in the bottom graph of Figure 3. It is easy to see that the availability stabilizes at the steady-state availability A(∞).
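The whole computation in this section can be scripted. The sketch below uses a hypothetical 6-state generator with U = {1, 2}, C = {3, 4}, D = {5, 6} (not the generator of the paper's example): it solves for the equilibrium distribution, forms G_CU and u_U via (3.4), and checks that f_up from Theorem 3.1 is a proper density with a positive mean m_up:

```python
import numpy as np
from scipy.integrate import quad
from scipy.linalg import expm

# Hypothetical generator; 0-based indices: U = {0,1}, C = {2,3}, D = {4,5}.
Q = np.array([[-3.0, 1.0, 1.0, 0.5, 0.5, 0.0],
              [1.0, -3.0, 0.5, 1.0, 0.0, 0.5],
              [1.0, 0.5, -2.5, 0.5, 0.5, 0.0],
              [0.5, 1.0, 0.5, -2.5, 0.0, 0.5],
              [0.5, 0.5, 0.5, 0.0, -2.0, 0.5],
              [0.5, 0.5, 0.0, 0.5, 0.5, -2.0]])
U, C, D = [0, 1], [2, 3], [4, 5]
W = U + C

# Equilibrium: solve p Q = 0 with p e = 1 as a least-squares system.
A = np.vstack([Q.T, np.ones(6)])
b = np.concatenate([np.zeros(6), [1.0]])
p_inf = np.linalg.lstsq(A, b, rcond=None)[0]

# Exit probabilities (3.2) and entry vector (3.4).
G_CU = np.linalg.solve(-Q[np.ix_(C, C)], Q[np.ix_(C, U)])
rate = p_inf[D] @ (Q[np.ix_(D, U)] + Q[np.ix_(D, C)] @ G_CU)
u_U = rate / rate.sum()

# Theorem 3.1: up-time density and mean up time.
Q_WW, Q_WD = Q[np.ix_(W, W)], Q[np.ix_(W, D)]
u = np.concatenate([u_U, np.zeros(len(C))])   # (u_U, 0)
f_up = lambda t: u @ expm(Q_WW * t) @ Q_WD @ np.ones(len(D))
Q_inv = np.linalg.inv(Q_WW)
m_up = u @ Q_inv @ Q_inv @ Q_WD @ np.ones(len(D))

total, _ = quad(f_up, 0.0, 60.0)
print(round(total, 6), m_up > 0)   # prints: 1.0 True
```

The integral of f_up equals one because exit from W = U ∪ C is certain for this generator; the same script can be extended to f_down, f_good, and f_bad by swapping the blocks per Theorems 3.2 and 3.3.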
19 C P (t) = 4 4 exp( 2 2 t) 4 sinh( 2 C P 4 (t) = exp( C P 4 (t) = exp( C P 44 (t) = 4 exp( 2 2 t) 4 sinh( t) + exp( 2t) cosh( t) exp( t), t) exp( t), 2 4 t) + exp( 2t) cosh( t), 4 2 t). Here, sinh(x) and cosh(x) are the hyperbolic functions. The curve of the availability A(t) is given in the top graph of Figure. The segment of the availability curve where t.8 is magnified in the bottom graph of Figure. It is easy to see the availability is stabilized and the steady-state availability A( ) Conclusions In this paper, we introduce a new Markov maintenance model to study the situation where a repairable system experiences certain modes in which the system behavior depends on the recent system evolution history. Such a system provides a realistic, tractable model for multi-state repairable systems with actual implementations of various maintenance policies. We employ the matrix method developed in the ion channel theory to calculate the system reliability measures, such as availability, and the distributions of up and down times. We also develop a stochastic comparison method to compare the availabilities of two systems with different parameters, and show that the availability of a monotone repairable system is reduced in a tougher operating environment. The stochastic availability comparison, such as the one presented in this paper, examines how the system availability varies in response to an environmental change. Such studies, to the best of our knowledge, have not appeared in the reliability literature. 7
[Figure 3: Availability curves. The top graph shows the availability curve A(t); the bottom graph details the segment of the curve from time instant 1.8 onward.]
References

[1] Colquhoun, D. and Hawkes, A.G. (1982). On the stochastic properties of bursts of single ion channel openings and of clusters of bursts. Phil. Trans. R. Soc. London B 300, 1-59.

[2] Colquhoun, D. and Hawkes, A.G. (1990). Stochastic properties of ion channel openings and bursts in a membrane patch that contains two channels: evidence concerning the number of channels present when a record containing only single openings is observed. Proc. R. Soc. London B 240.

[3] Jalali, A. and Hawkes, A.G. (1992a). The distribution of apparent occupancy times in a two-state Markov process in which brief events cannot be detected. Adv. Appl. Prob. 24.

[4] Jalali, A. and Hawkes, A.G. (1992b). Generalized eigenproblems arising in aggregated Markov processes allowing for time interval omission. Adv. Appl. Prob. 24.

[5] Li, H. and Shaked, M. (1994). Stochastic convexity and concavity of Markov processes. Mathematics of Operations Research 19.

[6] Massey, W. (1987). Stochastic orderings for Markov processes on partially ordered spaces. Mathematics of Operations Research 12.

[7] Shaked, M. and Shanthikumar, J.G. (1990). Reliability and maintainability. Handbook in Operations Research & Management Science, Vol. 2, D.P. Heyman and M.J. Sobel, Eds., Elsevier Science Publishers (North-Holland).

[8] Sumita, U., Shanthikumar, J.G. and Masuda, Y. (1987). Analysis of fault tolerant computer systems. Microelectronics and Reliability 27.

[9] Whitt, W. (1986). Stochastic comparisons for non-Markov processes. Mathematics of Operations Research 11.

[10] Widder, D.V. (1946). The Laplace Transform. Princeton University Press.
More informationCDA6530: Performance Models of Computers and Networks. Chapter 3: Review of Practical Stochastic Processes
CDA6530: Performance Models of Computers and Networks Chapter 3: Review of Practical Stochastic Processes Definition Stochastic process X = {X(t), t2 T} is a collection of random variables (rvs); one rv
More informationLIMITS FOR QUEUES AS THE WAITING ROOM GROWS. Bell Communications Research AT&T Bell Laboratories Red Bank, NJ Murray Hill, NJ 07974
LIMITS FOR QUEUES AS THE WAITING ROOM GROWS by Daniel P. Heyman Ward Whitt Bell Communications Research AT&T Bell Laboratories Red Bank, NJ 07701 Murray Hill, NJ 07974 May 11, 1988 ABSTRACT We study the
More informationAvailability. M(t) = 1 - e -mt
Availability Availability - A(t) the probability that the system is operating correctly and is available to perform its functions at the instant of time t More general concept than reliability: failure
More informationMarkov Chains in Continuous Time
Chapter 23 Markov Chains in Continuous Time Previously we looke at Markov chains, where the transitions betweenstatesoccurreatspecifietime- steps. That it, we mae time (a continuous variable) avance in
More informationSTOCHASTIC PROCESSES Basic notions
J. Virtamo 38.3143 Queueing Theory / Stochastic processes 1 STOCHASTIC PROCESSES Basic notions Often the systems we consider evolve in time and we are interested in their dynamic behaviour, usually involving
More informationCDA5530: Performance Models of Computers and Networks. Chapter 3: Review of Practical
CDA5530: Performance Models of Computers and Networks Chapter 3: Review of Practical Stochastic Processes Definition Stochastic ti process X = {X(t), t T} is a collection of random variables (rvs); one
More information2. Transience and Recurrence
Virtual Laboratories > 15. Markov Chains > 1 2 3 4 5 6 7 8 9 10 11 12 2. Transience and Recurrence The study of Markov chains, particularly the limiting behavior, depends critically on the random times
More informationMarkov Chains. X(t) is a Markov Process if, for arbitrary times t 1 < t 2 <... < t k < t k+1. If X(t) is discrete-valued. If X(t) is continuous-valued
Markov Chains X(t) is a Markov Process if, for arbitrary times t 1 < t 2
More informationIEOR 4106: Introduction to Operations Research: Stochastic Models Spring 2011, Professor Whitt Class Lecture Notes: Tuesday, March 1.
IEOR 46: Introduction to Operations Research: Stochastic Models Spring, Professor Whitt Class Lecture Notes: Tuesday, March. Continuous-Time Markov Chains, Ross Chapter 6 Problems for Discussion and Solutions.
More informationAn Overview of Methods for Applying Semi-Markov Processes in Biostatistics.
An Overview of Methods for Applying Semi-Markov Processes in Biostatistics. Charles J. Mode Department of Mathematics and Computer Science Drexel University Philadelphia, PA 19104 Overview of Topics. I.
More informationLecture 4a: Continuous-Time Markov Chain Models
Lecture 4a: Continuous-Time Markov Chain Models Continuous-time Markov chains are stochastic processes whose time is continuous, t [0, ), but the random variables are discrete. Prominent examples of continuous-time
More informationCost-Benefit Analysis of a System of Non-identical Units under Preventive Maintenance and Replacement Subject to Priority for Operation
International Journal of Statistics and Systems ISSN 973-2675 Volume 11, Number 1 (216), pp. 37-46 esearch India ublications http://www.ripublication.com Cost-Benefit Analysis of a System of Non-identical
More informationFailure modeling and maintenance optimization for a railway line
Failure modeling and maintenance optimization for a railway line Per Hokstad *, Helge Langseth SINTEF Technology and Society, Department of Safety and Reliability, N-7465 Trondheim, Norway Bo H. Lindqvist,
More informationData analysis and stochastic modeling
Data analysis and stochastic modeling Lecture 7 An introduction to queueing theory Guillaume Gravier guillaume.gravier@irisa.fr with a lot of help from Paul Jensen s course http://www.me.utexas.edu/ jensen/ormm/instruction/powerpoint/or_models_09/14_queuing.ppt
More informationMultivariate Risk Processes with Interacting Intensities
Multivariate Risk Processes with Interacting Intensities Nicole Bäuerle (joint work with Rudolf Grübel) Luminy, April 2010 Outline Multivariate pure birth processes Multivariate Risk Processes Fluid Limits
More informationIEOR 6711: Stochastic Models I, Fall 2003, Professor Whitt. Solutions to Final Exam: Thursday, December 18.
IEOR 6711: Stochastic Models I, Fall 23, Professor Whitt Solutions to Final Exam: Thursday, December 18. Below are six questions with several parts. Do as much as you can. Show your work. 1. Two-Pump Gas
More informationMARKOV MODEL WITH COSTS In Markov models we are often interested in cost calculations.
MARKOV MODEL WITH COSTS In Markov models we are often interested in cost calculations. inventory model: storage costs manpower planning model: salary costs machine reliability model: repair costs We will
More informationStochastic Modelling of a Computer System with Software Redundancy and Priority to Hardware Repair
Stochastic Modelling of a Computer System with Software Redundancy and Priority to Hardware Repair Abstract V.J. Munday* Department of Statistics, Ramjas College, University of Delhi, Delhi 110007 (India)
More informationMarkov processes and queueing networks
Inria September 22, 2015 Outline Poisson processes Markov jump processes Some queueing networks The Poisson distribution (Siméon-Denis Poisson, 1781-1840) { } e λ λ n n! As prevalent as Gaussian distribution
More informationOn the Convolution Order with Reliability Applications
Applied Mathematical Sciences, Vol. 3, 2009, no. 16, 767-778 On the Convolution Order with Reliability Applications A. Alzaid and M. Kayid King Saud University, College of Science Dept. of Statistics and
More informationChapter 8. Calculation of PFD using Markov
Chapter 8. Calculation of PFD using Markov Mary Ann Lundteigen Marvin Rausand RAMS Group Department of Mechanical and Industrial Engineering NTNU (Version 0.1) Lundteigen& Rausand Chapter 8.Calculation
More informationReliability Analysis of a Fuel Supply System in Automobile Engine
ISBN 978-93-84468-19-4 Proceedings of International Conference on Transportation and Civil Engineering (ICTCE'15) London, March 21-22, 2015, pp. 1-11 Reliability Analysis of a Fuel Supply System in Automobile
More information9. Reliability theory
Material based on original slides by Tuomas Tirronen ELEC-C720 Modeling and analysis of communication networks Contents Introduction Structural system models Reliability of structures of independent repairable
More informationAvailability and Reliability Analysis for Dependent System with Load-Sharing and Degradation Facility
International Journal of Systems Science and Applied Mathematics 2018; 3(1): 10-15 http://www.sciencepublishinggroup.com/j/ijssam doi: 10.11648/j.ijssam.20180301.12 ISSN: 2575-5838 (Print); ISSN: 2575-5803
More informationDS-GA 1002 Lecture notes 0 Fall Linear Algebra. These notes provide a review of basic concepts in linear algebra.
DS-GA 1002 Lecture notes 0 Fall 2016 Linear Algebra These notes provide a review of basic concepts in linear algebra. 1 Vector spaces You are no doubt familiar with vectors in R 2 or R 3, i.e. [ ] 1.1
More informationDISPERSIVE FUNCTIONS AND STOCHASTIC ORDERS
APPLICATIONES MATHEMATICAE 24,4(1997), pp. 429 444 J. BARTOSZEWICZ(Wroc law) DISPERSIVE UNCTIONS AND STOCHASTIC ORDERS Abstract. eneralizations of the hazard functions are proposed and general hazard rate
More informationMatching via Majorization for Consistency of Product Quality
Matching via Majorization for Consistency of Product Quality Lirong Cui Dejing Kong Haijun Li Abstract A new matching method is introduced in this paper to match attributes of parts in order to ensure
More informationLecture 11: Introduction to Markov Chains. Copyright G. Caire (Sample Lectures) 321
Lecture 11: Introduction to Markov Chains Copyright G. Caire (Sample Lectures) 321 Discrete-time random processes A sequence of RVs indexed by a variable n 2 {0, 1, 2,...} forms a discretetime random process
More informationCHAPTER 9 AVAILABILITY DEMONSTRATION PLANS CONTENTS
Applied R&M Manual for Defence Systems Part D Supporting Theory CHAPTER 9 AVAILABILITY DEMONSTRATION PLANS CONTENTS 1 INTRODUCTION 2 2 CONCEPTS AND TERMINOLOGY 2 3 STATISTICAL TEST PLANNING 4 4 DEMONSTRATION
More informationCOST-BENEFIT ANALYSIS OF A SYSTEM OF NON- IDENTICAL UNITS UNDER PREVENTIVE MAINTENANCE AND REPLACEMENT
Journal of Reliability and Statistical Studies; ISSN (Print): 0974-8024, (Online): 2229-5666 Vol. 9, Issue 2 (2016): 17-27 COST-BENEFIT ANALYSIS OF A SYSTEM OF NON- IDENTICAL UNITS UNDER PREVENTIVE MAINTENANCE
More informationMarkov Processes Cont d. Kolmogorov Differential Equations
Markov Processes Cont d Kolmogorov Differential Equations The Kolmogorov Differential Equations characterize the transition functions {P ij (t)} of a Markov process. The time-dependent behavior of the
More informationRELATING TIME AND CUSTOMER AVERAGES FOR QUEUES USING FORWARD COUPLING FROM THE PAST
J. Appl. Prob. 45, 568 574 (28) Printed in England Applied Probability Trust 28 RELATING TIME AND CUSTOMER AVERAGES FOR QUEUES USING FORWARD COUPLING FROM THE PAST EROL A. PEKÖZ, Boston University SHELDON
More informationQuantitative Model Checking (QMC) - SS12
Quantitative Model Checking (QMC) - SS12 Lecture 06 David Spieler Saarland University, Germany June 4, 2012 1 / 34 Deciding Bisimulations 2 / 34 Partition Refinement Algorithm Notation: A partition P over
More informationStochastic Models. Edited by D.P. Heyman Bellcore. MJ. Sobel State University of New York at Stony Brook
Stochastic Models Edited by D.P. Heyman Bellcore MJ. Sobel State University of New York at Stony Brook 1990 NORTH-HOLLAND AMSTERDAM NEW YORK OXFORD TOKYO Contents Preface CHARTER 1 Point Processes R.F.
More informationAnalysis for Parallel Repairable System with Degradation Facility
American Journal of Mathematics and Statistics 212, 2(4): 7-74 DOI: 1.5923/j.ajms.21224.2 Analysis for Parallel Repairable System with Degradation Facility M. A. El-Damcese *, N. S. Temraz Department of
More information8. Statistical Equilibrium and Classification of States: Discrete Time Markov Chains
8. Statistical Equilibrium and Classification of States: Discrete Time Markov Chains 8.1 Review 8.2 Statistical Equilibrium 8.3 Two-State Markov Chain 8.4 Existence of P ( ) 8.5 Classification of States
More informationLTCC. Exercises. (1) Two possible weather conditions on any day: {rainy, sunny} (2) Tomorrow s weather depends only on today s weather
1. Markov chain LTCC. Exercises Let X 0, X 1, X 2,... be a Markov chain with state space {1, 2, 3, 4} and transition matrix 1/2 1/2 0 0 P = 0 1/2 1/3 1/6. 0 0 0 1 (a) What happens if the chain starts in
More informationMARKOV PROCESSES. Valerio Di Valerio
MARKOV PROCESSES Valerio Di Valerio Stochastic Process Definition: a stochastic process is a collection of random variables {X(t)} indexed by time t T Each X(t) X is a random variable that satisfy some
More informationSMSTC (2007/08) Probability.
SMSTC (27/8) Probability www.smstc.ac.uk Contents 12 Markov chains in continuous time 12 1 12.1 Markov property and the Kolmogorov equations.................... 12 2 12.1.1 Finite state space.................................
More informationAlternative Characterization of Ergodicity for Doubly Stochastic Chains
Alternative Characterization of Ergodicity for Doubly Stochastic Chains Behrouz Touri and Angelia Nedić Abstract In this paper we discuss the ergodicity of stochastic and doubly stochastic chains. We define
More information2 Theory. 2.1 State Space Representation S 2 S 1 S 3
In the following sections we develop the theory, illustrate the technique by applying it to a sample system, and validate the results using the method of enumeration. Notations: A-state functional (acceptable)
More informationEE 445 / 850: Final Examination
EE 445 / 850: Final Examination Date and Time: 3 Dec 0, PM Room: HLTH B6 Exam Duration: 3 hours One formula sheet permitted. - Covers chapters - 5 problems each carrying 0 marks - Must show all calculations
More informationBatch Arrival Queuing Models with Periodic Review
Batch Arrival Queuing Models with Periodic Review R. Sivaraman Ph.D. Research Scholar in Mathematics Sri Satya Sai University of Technology and Medical Sciences Bhopal, Madhya Pradesh National Awardee
More informationSTOCHASTIC MODELLING OF A COMPUTER SYSTEM WITH HARDWARE REDUNDANCY SUBJECT TO MAXIMUM REPAIR TIME
STOCHASTIC MODELLING OF A COMPUTER SYSTEM WITH HARDWARE REDUNDANCY SUBJECT TO MAXIMUM REPAIR TIME V.J. Munday* Department of Statistics, M.D. University, Rohtak-124001 (India) Email: vjmunday@rediffmail.com
More informationLECTURE #6 BIRTH-DEATH PROCESS
LECTURE #6 BIRTH-DEATH PROCESS 204528 Queueing Theory and Applications in Networks Assoc. Prof., Ph.D. (รศ.ดร. อน นต ผลเพ ม) Computer Engineering Department, Kasetsart University Outline 2 Birth-Death
More informationMATH 56A: STOCHASTIC PROCESSES CHAPTER 1
MATH 56A: STOCHASTIC PROCESSES CHAPTER. Finite Markov chains For the sake of completeness of these notes I decided to write a summary of the basic concepts of finite Markov chains. The topics in this chapter
More informationFigure 10.1: Recording when the event E occurs
10 Poisson Processes Let T R be an interval. A family of random variables {X(t) ; t T} is called a continuous time stochastic process. We often consider T = [0, 1] and T = [0, ). As X(t) is a random variable
More informationSelecting Efficient Correlated Equilibria Through Distributed Learning. Jason R. Marden
1 Selecting Efficient Correlated Equilibria Through Distributed Learning Jason R. Marden Abstract A learning rule is completely uncoupled if each player s behavior is conditioned only on his own realized
More informationANALYSIS FOR A PARALLEL REPAIRABLE SYSTEM WITH DIFFERENT FAILURE MODES
Journal of Reliability and Statistical Studies; ISSN (Print): 0974-8024, (Online):2229-5666, Vol. 5, Issue 1 (2012): 95-106 ANALYSIS FOR A PARALLEL REPAIRABLE SYSTEM WITH DIFFERENT FAILURE MODES M. A.
More informationStochastic Analysis of a Cold Standby System with Server Failure
International Journal of Mathematics and Statistics Invention (IJMSI) E-ISSN: 2321 4767 P-ISSN: 2321-4759 Volume 4 Issue 6 August. 2016 PP-18-22 Stochastic Analysis of a Cold Standby System with Server
More informationCensoring Technique in Studying Block-Structured Markov Chains
Censoring Technique in Studying Block-Structured Markov Chains Yiqiang Q. Zhao 1 Abstract: Markov chains with block-structured transition matrices find many applications in various areas. Such Markov chains
More informationMarkov Chains. As part of Interdisciplinary Mathematical Modeling, By Warren Weckesser Copyright c 2006.
Markov Chains As part of Interdisciplinary Mathematical Modeling, By Warren Weckesser Copyright c 2006 1 Introduction A (finite) Markov chain is a process with a finite number of states (or outcomes, or
More informationAn Efficient Calculation Algorithm in Continuous-Time Markov Analysis Using Large-Scale Numerical Calculation
International Mathematical Forum, Vol. 7, 2012, no. 10, 455-467 An Efficient Calculation Algorithm in Continuous-Time Markov Analysis Using Large-Scale Numerical Calculation Nobuko Kosugi and Koichi Suyama
More informationAdam Caromicoli. Alan S. Willsky 1. Stanley B. Gershwin 2. Abstract
December 1987 LIDS-P-1727 MULTIPLE TIME SCALE ANALYSIS OF MANUFACTURING SYSTEMS Adam Caromicoli Alan S. Willsky 1 Stanley B. Gershwin 2 Abstract In this paper we use results on the aggregation of singularly
More informationReliability of Safety-Critical Systems Chapter 9. Average frequency of dangerous failures
Reliability of Safety-Critical Systems Chapter 9. Average frequency of dangerous failures Mary Ann Lundteigen and Marvin Rausand mary.a.lundteigen@ntnu.no &marvin.rausand@ntnu.no RAMS Group Department
More informationProbabilistic Evaluation of the Effect of Maintenance Parameters on Reliability and Cost
Probabilistic Evaluation of the Effect of Maintenance Parameters on Reliability and Cost Mohsen Ghavami Electrical and Computer Engineering Department Texas A&M University College Station, TX 77843-3128,
More informationStochastic Analysis of a Two-Unit Cold Standby System with Arbitrary Distributions for Life, Repair and Waiting Times
International Journal of Performability Engineering Vol. 11, No. 3, May 2015, pp. 293-299. RAMS Consultants Printed in India Stochastic Analysis of a Two-Unit Cold Standby System with Arbitrary Distributions
More informationANALYTICAL MODEL OF A VIRTUAL BACKBONE STABILITY IN MOBILE ENVIRONMENT
(The 4th New York Metro Area Networking Workshop, New York City, Sept. 2004) ANALYTICAL MODEL OF A VIRTUAL BACKBONE STABILITY IN MOBILE ENVIRONMENT Ibrahim Hökelek 1, Mariusz A. Fecko 2, M. Ümit Uyar 1
More informationRELIABILITY ANALYSIS OF A FUEL SUPPLY SYSTEM IN AN AUTOMOBILE ENGINE
International J. of Math. Sci. & Engg. Appls. (IJMSEA) ISSN 973-9424, Vol. 9 No. III (September, 215), pp. 125-139 RELIABILITY ANALYSIS OF A FUEL SUPPLY SYSTEM IN AN AUTOMOBILE ENGINE R. K. AGNIHOTRI 1,
More informationContagious default: application of methods of Statistical Mechanics in Finance
Contagious default: application of methods of Statistical Mechanics in Finance Wolfgang J. Runggaldier University of Padova, Italy www.math.unipd.it/runggaldier based on joint work with : Paolo Dai Pra,
More informationSoftware Reliability & Testing
Repairable systems Repairable system A reparable system is obtained by glueing individual non-repairable systems each around a single failure To describe this gluing process we need to review the concept
More informationStochastic Modelling Unit 1: Markov chain models
Stochastic Modelling Unit 1: Markov chain models Russell Gerrard and Douglas Wright Cass Business School, City University, London June 2004 Contents of Unit 1 1 Stochastic Processes 2 Markov Chains 3 Poisson
More informationRedundant Array of Independent Disks
Redundant Array of Independent Disks Yashwant K. Malaiya 1 Redundant Array of Independent Disks (RAID) Enables greater levels of performance and/or reliability How? By concurrent use of two or more hard
More informationLecture Notes 7 Random Processes. Markov Processes Markov Chains. Random Processes
Lecture Notes 7 Random Processes Definition IID Processes Bernoulli Process Binomial Counting Process Interarrival Time Process Markov Processes Markov Chains Classification of States Steady State Probabilities
More informationCost Analysis of a vacation machine repair model
Available online at www.sciencedirect.com Procedia - Social and Behavioral Sciences 25 (2011) 246 256 International Conference on Asia Pacific Business Innovation & Technology Management Cost Analysis
More informationDiagonalization by a unitary similarity transformation
Physics 116A Winter 2011 Diagonalization by a unitary similarity transformation In these notes, we will always assume that the vector space V is a complex n-dimensional space 1 Introduction A semi-simple
More informationRecap. Probability, stochastic processes, Markov chains. ELEC-C7210 Modeling and analysis of communication networks
Recap Probability, stochastic processes, Markov chains ELEC-C7210 Modeling and analysis of communication networks 1 Recap: Probability theory important distributions Discrete distributions Geometric distribution
More informationThis operation is - associative A + (B + C) = (A + B) + C; - commutative A + B = B + A; - has a neutral element O + A = A, here O is the null matrix
1 Matrix Algebra Reading [SB] 81-85, pp 153-180 11 Matrix Operations 1 Addition a 11 a 12 a 1n a 21 a 22 a 2n a m1 a m2 a mn + b 11 b 12 b 1n b 21 b 22 b 2n b m1 b m2 b mn a 11 + b 11 a 12 + b 12 a 1n
More informationAnalysis of a Machine Repair System with Warm Spares and N-Policy Vacations
The 7th International Symposium on Operations Research and Its Applications (ISORA 08) ijiang, China, October 31 Novemver 3, 2008 Copyright 2008 ORSC & APORC, pp. 190 198 Analysis of a Machine Repair System
More informationOPTIMIZATION BY SIMULATED ANNEALING: A NECESSARY AND SUFFICIENT CONDITION FOR CONVERGENCE. Bruce Hajek* University of Illinois at Champaign-Urbana
OPTIMIZATION BY SIMULATED ANNEALING: A NECESSARY AND SUFFICIENT CONDITION FOR CONVERGENCE Bruce Hajek* University of Illinois at Champaign-Urbana A Monte Carlo optimization technique called "simulated
More informationDependable Computer Systems
Dependable Computer Systems Part 3: Fault-Tolerance and Modelling Contents Reliability: Basic Mathematical Model Example Failure Rate Functions Probabilistic Structural-Based Modeling: Part 1 Maintenance
More informationA mathematical model for a copolymer in an emulsion
J Math Chem (2010) 48:83 94 DOI 10.1007/s10910-009-9564-y ORIGINAL PAPER A mathematical model for a copolymer in an emulsion F. den Hollander N. Pétrélis Received: 3 June 2007 / Accepted: 22 April 2009
More informationLecture 20 : Markov Chains
CSCI 3560 Probability and Computing Instructor: Bogdan Chlebus Lecture 0 : Markov Chains We consider stochastic processes. A process represents a system that evolves through incremental changes called
More informationSupermodular ordering of Poisson arrays
Supermodular ordering of Poisson arrays Bünyamin Kızıldemir Nicolas Privault Division of Mathematical Sciences School of Physical and Mathematical Sciences Nanyang Technological University 637371 Singapore
More informationThe Transition Probability Function P ij (t)
The Transition Probability Function P ij (t) Consider a continuous time Markov chain {X(t), t 0}. We are interested in the probability that in t time units the process will be in state j, given that it
More informationA COMPLETELY MONOTONIC FUNCTION INVOLVING THE TRI- AND TETRA-GAMMA FUNCTIONS
ao DOI:.2478/s275-3-9-2 Math. Slovaca 63 (23), No. 3, 469 478 A COMPLETELY MONOTONIC FUNCTION INVOLVING THE TRI- AND TETRA-GAMMA FUNCTIONS Bai-Ni Guo* Jiao-Lian Zhao** Feng Qi* (Communicated by Ján Borsík
More informationMarkov Processes Hamid R. Rabiee
Markov Processes Hamid R. Rabiee Overview Markov Property Markov Chains Definition Stationary Property Paths in Markov Chains Classification of States Steady States in MCs. 2 Markov Property A discrete
More informationStochastic Models: Markov Chains and their Generalizations
Scuola di Dottorato in Scienza ed Alta Tecnologia Dottorato in Informatica Universita di Torino Stochastic Models: Markov Chains and their Generalizations Gianfranco Balbo e Andras Horvath Outline Introduction
More informationChapter 5. Continuous-Time Markov Chains. Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan
Chapter 5. Continuous-Time Markov Chains Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan Continuous-Time Markov Chains Consider a continuous-time stochastic process
More informationLecture 9 Classification of States
Lecture 9: Classification of States of 27 Course: M32K Intro to Stochastic Processes Term: Fall 204 Instructor: Gordan Zitkovic Lecture 9 Classification of States There will be a lot of definitions and
More informationOutlines. Discrete Time Markov Chain (DTMC) Continuous Time Markov Chain (CTMC)
Markov Chains (2) Outlines Discrete Time Markov Chain (DTMC) Continuous Time Markov Chain (CTMC) 2 pj ( n) denotes the pmf of the random variable p ( n) P( X j) j We will only be concerned with homogenous
More informationMinimum Moment Steiner Trees
Minimum Moment Steiner Trees Wangqi Qiu Weiping Shi Abstract For a rectilinear Steiner tree T with a root, define its k-th moment M k (T ) = (d T (u)) k du, T where the integration is over all edges of
More informationConstruction of Smooth Fractal Surfaces Using Hermite Fractal Interpolation Functions. P. Bouboulis, L. Dalla and M. Kostaki-Kosta
BULLETIN OF THE GREEK MATHEMATICAL SOCIETY Volume 54 27 (79 95) Construction of Smooth Fractal Surfaces Using Hermite Fractal Interpolation Functions P. Bouboulis L. Dalla and M. Kostaki-Kosta Received
More informationA Markov model for estimating the remaining life of electrical insulation in distribution transformer
AMERICAN JOURNAL OF SCIENTIFIC AND INDUSTRIAL RESEARCH 2010, Science Huβ, http://www.scihub.org/ajsir ISSN: 2153-649X doi:10.5251/ajsir.2010.1.3.539.548 A Markov model for estimating the remaining life
More informationQ = (c) Assuming that Ricoh has been working continuously for 7 days, what is the probability that it will remain working at least 8 more days?
IEOR 4106: Introduction to Operations Research: Stochastic Models Spring 2005, Professor Whitt, Second Midterm Exam Chapters 5-6 in Ross, Thursday, March 31, 11:00am-1:00pm Open Book: but only the Ross
More informationMulti-State Availability Modeling in Practice
Multi-State Availability Modeling in Practice Kishor S. Trivedi, Dong Seong Kim, Xiaoyan Yin Depart ment of Electrical and Computer Engineering, Duke University, Durham, NC 27708 USA kst@ee.duke.edu, {dk76,
More informationReliability of Coherent Systems with Dependent Component Lifetimes
Reliability of Coherent Systems with Dependent Component Lifetimes M. Burkschat Abstract In reliability theory, coherent systems represent a classical framework for describing the structure of technical
More informationA FRAMEWORK FOR PERFORMABILITY MODELLING USING PROXELS. Sanja Lazarova-Molnar, Graham Horton
A FRAMEWORK FOR PERFORMABILITY MODELLING USING PROXELS Sanja Lazarova-Molnar, Graham Horton University of Magdeburg Department of Computer Science Universitaetsplatz 2, 39106 Magdeburg, Germany sanja@sim-md.de
More informationPart II: continuous time Markov chain (CTMC)
Part II: continuous time Markov chain (CTMC) Continuous time discrete state Markov process Definition (Markovian property) X(t) is a CTMC, if for any n and any sequence t 1
More informationTail Dependence of Multivariate Pareto Distributions
!#"%$ & ' ") * +!-,#. /10 243537698:6 ;=@?A BCDBFEHGIBJEHKLB MONQP RS?UTV=XW>YZ=eda gihjlknmcoqprj stmfovuxw yy z {} ~ ƒ }ˆŠ ~Œ~Ž f ˆ ` š œžÿ~ ~Ÿ œ } ƒ œ ˆŠ~ œ
More information