STOCHASTIC ADDING MACHINES BASED ON BRATTELI DIAGRAMS. Danilo Antonio Caprio. Ali Messaoudi. Glauco Valle
STOCHASTIC ADDING MACHINES BASED ON BRATTELI DIAGRAMS

Danilo Antonio Caprio, UNESP - Departamento de Matemática do Instituto de Biociências, Letras e Ciências Exatas. Rua Cristóvão Colombo, 65, Jardim Nazareth, 554- São José do Rio Preto, SP, Brasil.
Ali Messaoudi, UNESP - Departamento de Matemática do Instituto de Biociências, Letras e Ciências Exatas. Rua Cristóvão Colombo, 65, Jardim Nazareth, 554- São José do Rio Preto, SP, Brasil.
Glauco Valle, Universidade Federal do Rio de Janeiro - Instituto de Matemática. Caixa Postal 6853, Rio de Janeiro, Brasil.

Abstract. In this paper, we define some Markov chains associated to Vershik maps on Bratteli diagrams. We study probabilistic and spectral properties of their transition operators and we prove that the spectra of these operators are connected to Julia sets in higher dimensions. We also study topological properties of these spectra.

1. Introduction

Let g be a holomorphic map on C^d, where d ≥ 1 is an integer. The set K(g) of z ∈ C^d such that the forward orbit {g^n(z) : n ∈ N} is bounded is called the (d-dimensional) filled Julia set of g. Filled Julia sets and their boundaries (called Julia sets) were defined independently by Julia and Fatou ([5] and [6], [3] and [4]). The study of Julia sets is connected to many areas of mathematics such as dynamical systems, complex analysis, functional analysis and number theory, among others (see for example [6], [8], [9], [], [], [4], [7], [], [5], [7], [8], [33], [37]).

There is an important connection between Julia sets and stochastic adding machines. A first example was given by Killeen and Taylor in [6] as follows: let n be a nonnegative integer and write it in a unique way in base 2 as n = Σ_{i=0}^{k} ε_i(n) 2^i = ε_k ... ε_0, for some k ≥ 0, where ε_k = 1 and ε_i ∈ {0, 1} for all i ∈ {0, ..., k−1}.

1991 Mathematics Subject Classification. Primary: 37A30, 37F50; Secondary: 60J10, 47A10.
Key words and phrases. Markov chains, stochastic Vershik map, Bratteli diagrams, spectrum of transition operators, fibered Julia sets.
Supported by FAPESP grant 5/66 6. Supported by CNPq grant 37776/5 8 and FAPESP project 3/ Supported by FAPERJ grants E 6/3.48/6 and CNPq grants 3585/5 and 4383/6.
It is known that the addition of 1 is given by a classical algorithm, namely n + 1 = ε_k ... ε_{l+1} (ε_l + 1) 0 ... 0, where l = min{i ≥ 0 : ε_i(n) = 0}. Killeen and Taylor defined the stochastic adding machine assuming that each time a carry should be added, it is added with probability 0 < p < 1 and it is not added with probability 1 − p. Moreover, the algorithm stops when the first carry is not added. So this random algorithm maps n = ε_k ... ε_0 to n itself with probability 1 − p, to n + 1 with probability p^{l+1} and to m = n − 2^r + 1 = ε_k ... ε_{r+1} ε_r 0 ... 0 with probability p^r (1 − p), for 1 ≤ r ≤ l. With this they obtained a countable Markov chain whose associated transition operator S = (p_{i,j})_{i,j ∈ N} is a bistochastic infinite matrix whose spectrum is equal to the filled Julia set of the quadratic map z ↦ ((z − (1 − p))/p)^2, z ∈ C.

In [9], [3], [3] and [3], stochastic adding machines based on other systems of numeration have been introduced. They are connected to one-dimensional fibered Julia sets (see [9]) and also to Julia sets in dimension greater than one ([7], [3] and [3]). A d-dimensional fibered filled Julia set of a sequence (g_j)_{j≥1} of holomorphic maps on C^d is the set K((g_j)_j) of z ∈ C^d such that the forward orbit {g^{(j)}(z) : j ∈ N} is bounded, where g^{(j)} = g_j ∘ g_{j−1} ∘ ... ∘ g_1 for all j ≥ 1.

In this paper, we introduce stochastic adding machines associated to Vershik maps on Bratteli diagrams. Bratteli diagrams are important objects in the theories of operator algebras and dynamical systems. They were originally defined in 1972 by O. Bratteli [3] for the classification of C*-algebras. Bratteli diagrams turned out to be a powerful tool in the study of measurable, Borel, and Cantor dynamics (see [8], [], [8], [35]). The interest in Bratteli diagrams is that any aperiodic transformation in measurable, Borel, and Cantor dynamics can be realized as a Vershik map acting on the path space of a Bratteli diagram (see [], [], [8], [35], [36]).
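As an illustration of the random carry algorithm just described, the one-step transition distribution of the Killeen-Taylor machine can be sketched in a few lines (the function name and the dictionary representation are ours, not the authors'):

```python
def kt_distribution(n, p):
    """Transition distribution of the Killeen-Taylor base-2 stochastic
    adding machine: each carry is performed with probability p."""
    # l = index of the lowest 0-bit of n, i.e. the number of carries
    # the deterministic algorithm n -> n + 1 has to perform
    l = 0
    while (n >> l) & 1:
        l += 1
    dist = {}
    # the algorithm stops before the (r+1)-th carry, r = 0, ..., l:
    # the r lowest 1-bits have been flipped to 0, giving n - (2^r - 1)
    for r in range(l + 1):
        outcome = n - (2 ** r - 1)
        dist[outcome] = dist.get(outcome, 0.0) + p ** r * (1 - p)
    # all l carries and the final increment succeed: n + 1
    dist[n + 1] = dist.get(n + 1, 0.0) + p ** (l + 1)
    return dist
```

For instance, kt_distribution(3, 0.5) assigns probability 1/2 to staying at 3, 1/4 to 2, 1/8 to 0 and 1/8 to 4, and the probabilities always sum to one.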
A particular application arises when we use the Vershik map to embed Z_+ into the set of paths of the associated Bratteli diagram. This embedding allows us to consider the restriction of the Vershik map to that copy of Z_+ as the map n ↦ n + 1. It also allows a representation of systems of numeration through Bratteli diagrams, making it possible for us to introduce more general stochastic adding machines. Indeed, we are able to define a more general Markov process on the set X of infinite paths on the Bratteli diagram whose restriction to the copy of Z_+ is the stochastic adding machine. We call this process the "Bratteli-Vershik process" or simply BV process, and the associated stochastic adding machine the Bratteli-Vershik stochastic adding machine or simply BV stochastic adding machine.

We will give necessary and sufficient conditions that assure transience or recurrence of the BV stochastic adding machines. We will also prove that the spectrum of the BV stochastic adding machine transition operator S (acting on l^1) is related to fibered filled Julia sets in higher dimension. For example, if the Bratteli diagram is stationary and its incidence matrix is M = (a b; c d), where a, b, c, d are nonnegative integers, then the point spectrum of the transition operator of the Bratteli-Vershik
stochastic adding machine associated to M is related to the Julia set

K := {(x, y) ∈ C^2 : (g_n ∘ ... ∘ g_1(x, y))_{n≥1} is bounded},

where g_n(x, y) = ( (x^a y^b − (1 − p_{n+1}))/p_{n+1} , (x^c y^d − (1 − p_{n+1}))/p_{n+1} ) and 0 < p_{n+1} < 1, for all n ≥ 0.

Just to mention an important connection, the study of these spectra gives information about the dynamical properties of transition operators acting on separable Banach spaces (see for instance [] and [9]). For example, if T is topologically transitive, then any connected component of the spectrum intersects the unit circle. However, here we do not aim at the study of the dynamical properties of the transition operators. We will also study topological properties of this spectrum.

The paper is organized as follows. In Section 2 we give background about Bratteli diagrams and we define the Vershik map. In Section 3 we define the BV processes and the BV stochastic adding machines, giving necessary and sufficient conditions for transience, null recurrence and positive recurrence. Section 4 is devoted to providing an exact description of the spectra of the transition operators of BV stochastic machines acting on l^1(N) in the case of 2 × 2 Bratteli diagrams. Furthermore, we prove some topological properties of this spectrum. Section 5 describes the generalization to l × l, l ≥ 3, Bratteli diagrams.

2. Bratteli diagrams

2.1. Basics on Bratteli diagrams. In this section we introduce the necessary notation on Bratteli diagrams. Here we follow [] and [] and we recommend both texts as references for the interested reader.

Definition 2.1. A Bratteli diagram is an infinite directed graph (V, E) where the vertex set V and the edge set E can be partitioned into finite sets, i.e. V = ∪_{k≥0} V(k) and E = ∪_{k≥1} E(k), where #V(k) < ∞ and #E(k) < ∞ for every k, such that there exist maps s : E → V and r : E → V such that s restricted to E(k) is a surjective map from E(k) to V(k−1) and r restricted to E(k) is a surjective map from E(k) to V(k) for every k ≥ 1.
For every e ∈ E we call s(e) the source of e and r(e) the range of e (see Figure 1). For convenience, if #V(k) = l we denote V(k) = {(k, 1), ..., (k, l)}, or simply V(k) = {1, ..., l} when there is no possibility of misidentification of the value of k.

Remark 2.1. It is usual to define Bratteli diagrams under the condition that V(0) is a one-point set, i.e. V(0) = {v(0)}. Our definition is more suitable to the understanding of stationarity and more appropriate to the discussion of the results in this paper. However, we could also use that condition in the definition without any prejudice to the results in this paper.
It is convenient to give a diagrammatic representation of a Bratteli diagram considering V(k) as a "horizontal" level k, and the edges in E(k) heading downwards from vertices at level k−1 to vertices at level k. Also, if #V(k−1) = l(k−1) and #V(k) = l(k), then E(k) determines an l(k) × l(k−1) incidence matrix M(k) (see Figure 1), where M(k)_{i,j} is the number of edges going from vertex j in V(k−1) to vertex i in V(k). By the definition of Bratteli diagrams, M(k) has no identically zero lines or columns.

Figure 1. Diagrammatic representation of the levels between n−1 and n+1 in a Bratteli diagram.

Let k, k̄ ∈ Z_+ with k < k̄ and let E(k+1) ∘ E(k+2) ∘ ... ∘ E(k̄) denote the set of paths from V(k) to V(k̄). Specifically, E(k+1) ∘ ... ∘ E(k̄) denotes the following set:

{(e_{k+1}, ..., e_{k̄}) : e_i ∈ E(i), k+1 ≤ i ≤ k̄, r(e_i) = s(e_{i+1}), k+1 ≤ i ≤ k̄−1}.

The incidence matrix of E(k+1) ∘ ... ∘ E(k̄) is the product M(k̄) ··· M(k+1). We define r(e_{k+1}, ..., e_{k̄}) := r(e_{k̄}) and s(e_{k+1}, ..., e_{k̄}) := s(e_{k+1}).

Definition 2.3. We say that (V, E) is a simple Bratteli diagram if for each nonnegative integer k, there exists an integer k̄ > k such that the product M(k̄) ··· M(k+1) has only non-zero entries.

2.2. Ordered Bratteli diagrams.

Definition 2.4. An ordered Bratteli diagram (V, E, ≥) is a Bratteli diagram (V, E) together with a partial order ≥ on E such that edges e, e' ∈ E are comparable if and only if r(e) = r(e'); in other words, we have a linear order on the set r^{−1}({v}) for each v ∈ V \ V(0) (see an example in Figure 2).
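As a quick sanity check of the product rule for incidence matrices, one can count paths on a toy diagram (the matrices below are our own illustrative choice, not taken from the paper):

```python
def mat_mult(A, B):
    """Product of integer matrices A (m x n) and B (n x p)."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Hypothetical incidence matrices: M(k)[i][j] = number of edges from
# vertex j of V(k-1) to vertex i of V(k) (indices 0-based here).
M1 = [[1, 1], [1, 0]]
M2 = [[2, 1], [1, 3]]

# The incidence matrix of the path set E(1) o E(2) from V(0) to V(2)
# is the product M(2) M(1): entry (i, j) counts the paths from
# vertex j of V(0) to vertex i of V(2).
paths = mat_mult(M2, M1)  # [[3, 2], [4, 1]]
```

For example, 3 = 2·1 + 1·1 paths arrive at the first vertex of V(2) from the first vertex of V(0): two through the first vertex of V(1) and one through the second.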
Figure 2. An order on the Bratteli diagram of Figure 1.

Remark 2.2. Edges in an ordered Bratteli diagram (V, E, ≥) are uniquely determined by a four dimensional vector e = (k, s, m, r), where k means that e ∈ E(k), s = s(e) and r = r(e) are the source and range of e as previously defined, and m ∈ Z_+ is the order index, meaning that e is the edge of order m in the linearly ordered set r^{−1}(r(e)). Usually we will write e = e_k = (s, m, r), carrying the level index k as a subscript or suppressing it when there is no doubt about the level.

Note that if (V, E, ≥) is an ordered Bratteli diagram and k < k̄ in Z_+, then the set E(k+1) ∘ E(k+2) ∘ ... ∘ E(k̄) of paths from V(k) to V(k̄) may be given an induced order as follows: (e_{k+1}, e_{k+2}, ..., e_{k̄}) > (f_{k+1}, f_{k+2}, ..., f_{k̄}) if and only if for some i with k+1 ≤ i ≤ k̄ we have e_i > f_i and e_j = f_j for i < j ≤ k̄.

Definition 2.5. A Bratteli diagram (V, E) is stationary if there exists l such that l = #V(k) for all k ≥ 1, and (by an appropriate relabelling of the vertices if necessary) the incidence matrices between levels k and k+1 are the same l × l matrix M for all k ≥ 1. In other words, beyond level 1 the diagram repeats itself. An ordered Bratteli diagram B = (V, E, ≥) is stationary if (V, E) is stationary, and the ordering on the edges with range (k, i) is the same as the ordering on the edges with range (k̄, i) for k, k̄ ≥ 1 and i = 1, ..., l. In other words, beyond level 1 the diagram with the ordering repeats itself.

We still need a definition that will be useful to deal with examples.

Definition 2.6. Let B = (V, E, ≥) be an ordered Bratteli diagram. We say that ≥ is a consecutive ordering if for all edges e ≤ f ≤ e' with s(e) = s(e') we have s(f) = s(e) = s(e').

To every ordered Bratteli diagram with consecutive ordering B = (V, E, ≥) we associate a sequence of matrices (Q(k))_{k≥1}, called the ordering matrices, such that (i) Q(k) is an l(k) × l(k−1) matrix; (ii) Q(k)_{i,j} = 0 if and only if M(k)_{i,j} = 0;
(iii) the non-zero entries in each line i of Q(k) form a permutation of #{j : M(k)_{i,j} > 0} letters.

So line i in Q(k) indicates how the edges inciding on vertex i ∈ V(k) are ordered with respect to their sources in V(k−1). The consecutive ordering is said to be canonical if, in each line of Q(k), k ≥ 1, the permutation of #{j : M(k)_{i,j} > 0} letters is the identity. For a stationary ordered Bratteli diagram, the consecutive ordering is also stationary, i.e. Q = Q(k) for every k ≥ 1. As an example, consider a stationary ordered Bratteli diagram with l = 2 and incidence matrix

M = (a b; c 0), with abc > 0.

We have two possible consecutive orderings, relative to the ordering matrices

(1 2; 1 0) or (2 1; 1 0),

where the first one is associated to the canonical consecutive ordering.

2.3. The Vershik map. Let B = (V, E, ≥) be an ordered Bratteli diagram. Let X denote the associated infinite path space, i.e.

X = {(e_1, e_2, ...) : e_i ∈ E(i) and r(e_i) = s(e_{i+1}), for all i ≥ 1}.

Under the hypotheses of Definition 2.1, X is non-empty. However, X can be a finite set; this only occurs in trivial cases and does not occur for general classes of Bratteli diagrams, as for instance simple Bratteli diagrams with #E(k) > 1 for infinitely many k ≥ 1. Hence we require that X is infinite for all Bratteli diagrams considered here. We endow X with a topology such that a basis of open sets is given by the family of cylinder sets

[e_1, e_2, ..., e_k] = {(f_1, f_2, ...) ∈ X : f_i = e_i, for all 1 ≤ i ≤ k}.

Each [e_1, ..., e_k] is also closed, as is easily seen. Endowed with this topology, we call X the Bratteli compactum associated with B = (V, E, ≥). Let d be the distance on X defined by d((e_j)_{j≥1}, (f_j)_{j≥1}) = 1/2^k, where k = inf{i ≥ 1 : e_i ≠ f_i}. The topology of the cylinder sets coincides with the topology induced by d. If (V, E) is a simple Bratteli diagram, then X has no isolated points, and so is a Cantor space (see [8]). Two paths in X are said to be cofinal if they have the same tails, i.e. the edges agree from a certain level. Let x = (e_1, e_2, ...)
be an element of X. We will call e_k = e_k(x) the k-th label of x. Recall from Remark 2.2 that e_k = (s_k, m_k, r_k), where r_k = s_{k+1} ∈ V(k) for every k ≥ 1. We let X_max denote the set of those elements x of X such that e_k(x) is a maximal edge for all k ≥ 1, and X_min the analogous set for the minimal edges. It is clear that from any vertex at level k there is an upward maximal path to level 0; using this we have that
X_max is the intersection of non-empty compact sets, so it is non-empty. Analogously, X_min is non-empty. From now on we denote X̃ := X \ X_max.

If B = (V, E, ≥) is an ordered Bratteli diagram, then it is easy to check that every infinite path x ∈ X̃ has a unique successor, i.e. the set {y ∈ X : y > x} has a smallest element. Indeed, let x = (e_1, e_2, ...) ∈ X̃ and let ζ(x) be the smallest number such that e_{ζ(x)} is not a maximal edge. Let f_{ζ(x)} = f_{ζ(x)}(x) be the successor of e_{ζ(x)} (and so r(e_{ζ(x)}) = r(f_{ζ(x)})). Then the successor of x is V(x) = y = (f_1, ..., f_{ζ(x)−1}, f_{ζ(x)}, e_{ζ(x)+1}, ...), where (f_1, ..., f_{ζ(x)−1}) = (f_1(x), ..., f_{ζ(x)−1}(x)) is the minimal path in E(1) ∘ E(2) ∘ ... ∘ E(ζ(x)−1) with range equal to s(f_{ζ(x)}), i.e. r(f_1, ..., f_{ζ(x)−1}) = s(f_{ζ(x)}).

Definition 2.7. The Vershik map of an ordered Bratteli diagram B = (V, E, ≥) is the map V : X̃ → X that associates to each x ∈ X̃ its successor. We call the resulting pair (X, V) a Bratteli-Vershik dynamical system.

3. The Bratteli-Vershik process and stochastic adding machine

Here we will define the BV process, but we need to introduce some new notation before it. Let B = (V, E, ≥) be an ordered Bratteli diagram. Recall the definition of ζ(x), for x ∈ X̃, from the previous section, and define

A(x) = {1 ≤ i < ζ(x) : e_i(x) is not a minimal edge}.

Put θ(x) = #A(x) and write A(x) = {k_{x,1}, ..., k_{x,θ(x)}}, where k_{x,i−1} < k_{x,i} for all i ∈ {2, ..., θ(x)}. Note that for k ∈ A(x) the edge e_k(x) is a maximal edge of x which is not minimal, which implies that e_k(x) is not the only edge arriving at r(e_k(x)). Thus, if #r^{−1}(v) > 1 for every v ∈ V \ V(0), or equivalently, if the sum of each line in each incidence matrix is greater than one, then we have that θ(x) = ζ(x) − 1 and A(x) = {1, ..., ζ(x) − 1}. So we have

Hypothesis A: For the ordered Bratteli diagram B = (V, E, ≥), the sum of each line in each incidence matrix is greater than one.

For each j ∈ {1, ..., θ(x)}, let y_j(x) ∈ X be defined as
(3.1) y_j(x) = (f_1^{(j)}, ..., f_{k_{x,j}}^{(j)}, e_{k_{x,j}+1}, e_{k_{x,j}+2}, ...),

where (f_1^{(j)}, ..., f_{k_{x,j}}^{(j)}) is the minimal path in E(1) ∘ ... ∘ E(k_{x,j}) with range equal to s(e_{k_{x,j}+1}), for each j ∈ {1, ..., θ(x)}.
First we need to adjust the space where the BV process will be defined. This is due to the fact that the successor of x ∈ X̃ can be an element of X_max. To avoid this, we define X^max as the set of points x ∈ X that are cofinal with a point of X_max, and set X̄ := X \ X^max. Note that if x ∈ X̄ then V(x) ∈ X̄. Moreover, V restricted to X̄ is one to one from X̄ onto X̄ \ X_min.

Definition 3.1. Let (p_i)_{i≥1} be a sequence of non-null probabilities and B = (V, E, ≥) an ordered Bratteli diagram. The Bratteli-Vershik process is a discrete time-homogeneous Markov process (Γ_n)_{n≥0} with state space X̄ defined as Γ_n = Ṽ^{(n)}(Γ_0), where Ṽ^{(n)} is the n-th iteration of the random Vershik map Ṽ : X̄ → X̄, defined as

Ṽ(x) =
  y_j(x), with probability p_{k_{x,1}} ··· p_{k_{x,j}} (1 − p_{k_{x,j+1}}), for each j ∈ {1, ..., θ(x) − 1};
  y_{θ(x)}(x), with probability p_{k_{x,1}} ··· p_{k_{x,θ(x)}} (1 − p_{ζ(x)});
  x, with probability 1 − p_{k_{x,1}};
  V(x), with probability p_{k_{x,1}} ··· p_{k_{x,θ(x)}} p_{ζ(x)}.

Thus the transition probabilities of the BV process are determined by the random Vershik map. The idea behind the definition is the use of the basic algorithm that obtains V(x) from x by recursively choosing the minimal path from level 1 to level k, for k ≤ ζ(x) − 1, and then at step ζ(x) finally obtaining V(x). Then we impose the rule that step j of the algorithm is performed with probability p_j, independently of any other step. This transition mechanism is connected to the stochastic adding machines discussed in Section 1, and our next aim is to define the BV stochastic adding machine.

Remark 3.1. Under Hypothesis A we have that

Ṽ(x) =
  y_j(x), with probability p_1 ··· p_j (1 − p_{j+1}), for each j ∈ {1, ..., ζ(x) − 1};
  x, with probability 1 − p_1;
  V(x), with probability p_1 ··· p_{ζ(x)−1} p_{ζ(x)}.

Take x_0 ∈ X̄ ∩ X_min and define X̄_{x_0} := {x_0} ∪ {V^{(n)}(x_0) : n ≥ 1}. Clearly we have a bijection between X̄_{x_0} and the set of non-negative integers Z_+, where x_0 ↔ 0 and V^{(n)}(x_0) ↔ n for all n ≥ 1. Using the fact that x_0 ∈ X_min, it is also straightforward to verify that for every x ∈ X̄_{x_0} we have Ṽ(x) ∈ X̄_{x_0} with probability one. To simplify the notation, we put x_n := V^{(n)}(x_0), and then X̄_{x_0} = {x_0, x_1, x_2, ...}.
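Under our reading of Definition 3.1, the transition distribution at a state x depends only on A(x), ζ(x) and the p_i, and the probabilities telescope to one. A small sketch (the function name and state labels are ours) makes this explicit:

```python
def bv_transition_probs(ks, zeta, p):
    """Transition distribution of the random Vershik map at a state x
    (our sketch of Definition 3.1). ks = sorted list [k_{x,1}, ...,
    k_{x,theta}] = A(x), zeta = zeta(x), p = dict {i: p_i}, i >= 1.
    Returns a list of (label, probability) pairs."""
    probs = []
    if ks:
        probs.append(("x", 1 - p[ks[0]]))  # nothing happens
    else:
        probs.append(("x", 1 - p[zeta]))   # theta(x) = 0: stay or move
    acc = 1.0
    for j, k in enumerate(ks):
        acc *= p[k]                         # p_{k_{x,1}} ... p_{k_{x,j}}
        nxt = ks[j + 1] if j + 1 < len(ks) else zeta
        probs.append((f"y_{j + 1}(x)", acc * (1 - p[nxt])))
    probs.append(("V(x)", acc * p[zeta]))   # the full successor V(x)
    return probs
```

For A(x) = {1, 3} and ζ(x) = 4, the situation of the Fibonacci example below, the four probabilities are 1 − p_1, p_1(1 − p_3), p_1 p_3 (1 − p_4) and p_1 p_3 p_4, which always sum to one.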
Definition 3.2. Let (p_i)_{i≥1} be a sequence of non-null probabilities, B = (V, E, ≥) be an ordered Bratteli diagram and x_0 ∈ X̄ ∩ X_min. The Bratteli-Vershik stochastic adding machine associated to them is the discrete time-homogeneous Markov chain (Y_n)_{n≥0} on X̄_{x_0} defined as Y_n = Γ_n for n ≥ 0, given that Y_0 = x_0.

Let (Y_n)_{n≥0} be a BV stochastic adding machine. We will denote the transition matrix of (Y_n)_{n≥0} by S = (S_{m,n})_{m,n ∈ N}, i.e.

(3.2) S_{m,n} := S(x_m, x_n) := P(Y_1 = x_n | Y_0 = x_m).

When X_min = {x_min} is a unitary set, there is a unique BV stochastic adding machine associated to B and a given sequence (p_i)_{i≥1}. This stochastic machine is the main object of study in this paper. To simplify notation we write X̄_{x_min} = X̄. The hypothesis X_min = {x_min} is a natural one and occurs when the level sets V(k) are ordered and the order on the edges is induced by the order on the source level sets.

Example 3.3. (The Cantor systems of numeration case) Consider the ordered Bratteli diagram represented by the sequence of matrices M_j = (d_j), for a sequence d_j ≥ 2, j ≥ 1. In this case we have a unique ordering, which is the canonical consecutive ordering. Moreover, Hypothesis A is clearly satisfied. In this case X_min is unitary and, given (d_j)_{j≥1} and (p_j)_{j≥1}, there is a unique associated BV stochastic adding machine. The stochastic adding machines associated to the Cantor systems of numeration were introduced by Messaoudi and Valle [3].

For instance, consider d_j = 2j, for all j ≥ 1. Let x = (e_1, e_2, e_3, e_4, ...) ∈ X̄, where e_1 = (1, 1, 1), e_2 = (1, 3, 1) and e_3 = (1, 4, 1). A representation of the path (e_1, e_2, e_3) in the diagram is presented in item (a) of Figure 4. Here we have ζ(x) = 3, because e_1 and e_2 are maximal edges and e_3 is not maximal. Thus V(x) = (f_1, f_2, f_3, e_4, e_5, ...), where f_1 = (1, 0, 1), f_2 = (1, 0, 1) and f_3 = (1, 5, 1) (see item (b) of Figure 4). Moreover, we have A(x) = {1, 2}, and y_1(x) = (f_1, e_2, e_3, e_4, ...) and y_2(x) = (f_1, f_2, e_3, e_4, ...)
(see items (c) and (d) of Figure 4, respectively). We have that x transitions to V(x) with probability p_1 p_2 p_3, x transitions to x with probability 1 − p_1, x transitions to y_1(x) with probability p_1(1 − p_2) and x transitions to y_2(x) with probability p_1 p_2 (1 − p_3). The initial parts of the transition graph and matrix for the chain are represented in Figure 3.

Remark 3.2. In Example 3.3, if d_j = 2 for all j ≥ 1, then we obtain the stochastic adding machine defined by Killeen and Taylor [6].

Example 3.4. Consider B as the stationary Bratteli diagram with consecutive ordering and incidence matrix M = (1 1; 1 3). This diagram satisfies Hypothesis A. Let x = (e_1, e_2, e_3, e_4, e_5, ...) ∈ X̄ be an infinite path, where e_1 = (2, 3, 2), e_2 = (2, 1, 1), e_3 = (1, 0, 1), e_4 = (1, 0, 2) and e_j = (2, 1, 2) for j ≥ 5. The representation of x in the diagram is given by the path in item (a) of Figure 6.
Figure 3. Initial parts of the transition graph and matrix of the BV stochastic adding machine with incidence matrices M_j = (d_j), where d_1 = 2, d_2 = 4 and d_3 = 6.

Figure 4. Representation of paths in a Bratteli diagram with incidence matrices M_j = (d_j), where j ≥ 1, d_1 = 2, d_2 = 4 and d_3 = 6.

Here we have ζ(x) = 3 and V(x) = (f_1, f_2, f_3, e_4, e_5, ...), where f_1 = (1, 0, 1), f_2 = (1, 0, 2) and f_3 = (2, 1, 1) (see item (b) of Figure 6). Moreover, we have A(x) = {1, 2}, and y_1(x) = ((1, 0, 2), e_2, e_3, ...) and y_2(x) = ((1, 0, 1), (1, 0, 1), e_3, e_4, ...) (see items (c) and (d) of Figure 6, respectively).
Hence, we have that x transitions to V(x) with probability p_1 p_2 p_3, x transitions to x with probability 1 − p_1, x transitions to y_1(x) with probability p_1(1 − p_2) and x transitions to y_2(x) with probability p_1 p_2 (1 − p_3). Its transition graph and transition operator are represented in Figure 5.

Figure 5. Initial parts of the transition graph and matrix of the BV stochastic adding machine associated with a stationary Bratteli diagram with incidence matrix M.

Example 3.5. (The Fibonacci case) Consider the stationary ordered Bratteli diagram with the canonical consecutive ordering and incidence matrix M_F = (1 1; 1 0). In this case B does not satisfy Hypothesis A. Again X_min is unitary and, given (p_j)_{j≥1}, there is a unique associated BV stochastic adding machine. These stochastic adding machines are associated with the Fibonacci system of numeration and have been introduced in [3].

Let x = (e_1, e_2, e_3, e_4, ...) ∈ X̄ be an infinite path in the Bratteli diagram, where e_1 = (2, 1, 1), e_2 = (1, 0, 2), e_3 = (2, 1, 1), and e_j = (1, 0, 1) for all j ≥ 4. The representation of x in the diagram is given by the continuous path in item (a) of Figure 7. We have ζ(x) = 4 and V(x) = (f_1, f_2, f_3, f_4, e_5, ...), where f_4 = (2, 1, 1) and (f_1, f_2, f_3) is the minimal path in E(1) ∘ E(2) ∘ E(3) with range equal to s(f_4) (see item (b) of Figure 7). We have A(x) = {1, 3} = {n_1, n_2}, and y_{n_1}(x) = ((1, 0, 1), (1, 0, 2), (2, 1, 1), e_4, ...) and y_{n_2}(x) = ((1, 0, 1), (1, 0, 1), (1, 0, 1), e_4, ...) (see items (c) and (d) of Figure 7, respectively).
Figure 6. Representation of paths in a stationary Bratteli diagram with incidence matrix M.

Hence, we have that x transitions to V(x) with probability p_1 p_3 p_4, x transitions to x with probability 1 − p_1, x transitions to y_{n_1}(x) with probability p_1(1 − p_3) and x transitions to y_{n_2}(x) with probability p_1 p_3 (1 − p_4).

Figure 7. Representation of paths in a stationary Bratteli diagram with incidence matrix M_F.
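The mixed-radix bookkeeping behind the Cantor numeration of Example 3.3 can be sketched as follows (the function names and the 0-based digit convention are ours):

```python
def cantor_digits(n, ds):
    """Digits (eps_1, eps_2, ...) of n in the Cantor (mixed-radix) system
    with bases ds = [d_1, d_2, ...]: n = sum_j eps_j * d_1 * ... * d_{j-1},
    with 0 <= eps_j <= d_j - 1."""
    digits = []
    for d in ds:
        digits.append(n % d)
        n //= d
    return digits

def zeta(n, ds):
    """Level of the first non-maximal digit: the deterministic addition
    of 1 carries through levels 1, ..., zeta - 1 and stops at level zeta."""
    for j, (eps, d) in enumerate(zip(cantor_digits(n, ds), ds), start=1):
        if eps != d - 1:
            return j
    raise ValueError("n + 1 needs more levels than provided")
```

With d_j = 2j, the digits of n = 39 are (1, 3, 4) and zeta(39, [2, 4, 6]) = 3, matching, under this convention, the value ζ(x) = 3 computed in Example 3.3.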
Remark 3.3. Two distinct ordered Bratteli diagrams can generate the same stochastic adding machine. For instance, consider two stationary ordered Bratteli diagrams with consecutive ordering and incidence matrices M_1 = (2) and M_2 = (1 1; 1 1). Both diagrams generate a unique BV stochastic adding machine that corresponds to the stochastic machine studied by Killeen and Taylor in [6].

Before we discuss the probabilistic properties of the BV stochastic adding machines, we present some basic definitions from the theory of Markov chains, and we recommend [4] to the unfamiliar reader. Let Y = (Y_n)_{n≥0} be a Markov chain on a probability space (Ω, O, P). We denote by E[·] the expectation with respect to P. We say that Y is irreducible if for any pair of states i and j there exists m ≥ 1 such that P(Y_m = j | Y_0 = i) > 0. An irreducible Markov chain Y is transient if every state i is transient, i.e. P(Y_n = i for some n ≥ 1 | Y_0 = i) < 1; this is the probability that, starting in state i, the process will ever re-enter state i. If an irreducible Markov chain is not transient, we say that it is recurrent, and this means that every state i is recurrent, i.e. P(Y_n = i for some n ≥ 1 | Y_0 = i) = 1. Furthermore, a recurrent Markov chain is called positive recurrent if for each state i the expected return time m_i = E[R_i | Y_0 = i] is finite, where R_i = min{n ≥ 1 : Y_n = i}. Otherwise, if m_i = +∞, the Markov chain is called null recurrent.

Proposition 3.6. Let (p_i)_{i≥1} be a sequence of non-null probabilities such that #{i ≥ 1 : p_i < 1} = ∞. Every BV stochastic adding machine associated to (p_i)_{i≥1} is an irreducible Markov chain. Furthermore, the stochastic machine is transient if and only if Π_{j=1}^{∞} p_j > 0.

Proof. Let (Y_n)_{n≥0} be a BV stochastic adding machine associated to (p_i)_{i≥1}, an ordered Bratteli diagram B = (V, E, ≥) and x_0 ∈ X̄ ∩ X_min.
We have some special states x_{n_1}, x_{n_2}, ..., which are cofinal to x_0 by hypothesis, determined by the following: e_k(x_{n_j}) = e_k(x_0) for k ≥ j + 1, and (e_1(x_{n_j}), ..., e_j(x_{n_j})) is the maximal path in E(1) ∘ ... ∘ E(j) with range equal to s(e_{j+1}(x_0)). Concerning irreducibility, we just point out that (i) for every n ≥ 1 the chain can reach x_n with positive probability by making the transitions x_0 → x_1, x_1 → x_2, ..., x_{n−1} → x_n; (ii) for j + 1 ∈ {i : p_i < 1}, we can make the transition x_{n_j} → x_0 with probability (1 − p_{j+1}) Π_{i=1}^{j} p_i > 0.
By (i) and (ii), it is clear that (Y_n)_{n≥0} is irreducible. Now we consider the transience/recurrence of the chain. We rely on some additional properties of the chain related to the special states x_{n_j}, j ≥ 1. We have: (iii) once the chain arrives at x_{n_j + 1}, the successor of x_{n_j}, it can only visit x_{n_j} again if it visits x_0 first; (iv) if a transition x → x_0 is possible with positive probability, then x = x_{n_j} for some j; (v) given that a transition from x_{n_j} to x_{n_j + 1} or x_0 occurs, the next state of the chain is x_{n_j + 1} with probability p_{j+1}, i.e.

P(Y_{n+1} = x_{n_j + 1} | Y_n = x_{n_j}, Y_{n+1} ∈ {x_0, x_{n_j + 1}}) = p_{j+1}.

The verification of (iii), (iv) and (v) follows directly from the definition of (Y_n)_{n≥0}. By the Markov property and properties (i)-(v) above, the probability that (Y_n)_{n≥0} never returns to x_0 coincides with the probability of the event that (Y_n)_{n≥0} reaches x_{n_j} before it returns to x_0 for every j ≥ 1, which has probability Π_{j=1}^{∞} p_j. This probability is positive if and only if Π_{j=1}^{∞} p_j > 0, which proves the transience criterion.

Remark 3.4. Let (Y_n)_{n≥0} be an irreducible BV stochastic adding machine. If p_1 < 1, then clearly (Y_n)_{n≥0} is aperiodic, since P(Y_1 = x_0 | Y_0 = x_0) = 1 − p_1 > 0. However, when p_1 = 1 the chain can be periodic or aperiodic, depending on the Bratteli diagram.

Proposition 3.7. Let B be an ordered Bratteli diagram satisfying Hypothesis A and (p_i)_{i≥1} be a sequence of non-null probabilities such that #{i ≥ 1 : p_i < 1} = ∞ and Π_{j=1}^{∞} p_j = 0. Then every BV stochastic adding machine associated to (p_i)_{i≥1} is null recurrent.

Proof. Let (Y_n)_{n≥0} be a BV stochastic adding machine associated to (p_i)_{i≥1}, an ordered Bratteli diagram B = (V, E, ≥) and x_0 ∈ X̄ ∩ X_min. Suppose that B = (V, E, ≥) satisfies Hypothesis A, #{i ≥ 1 : p_i < 1} = ∞ and Π_{j=1}^{∞} p_j = 0. By Proposition 3.6 the chain is irreducible and recurrent. Put T = inf{n ≥ 1 : Y_n = x_0}, i.e. the first return time to x_0. We are going to show that the expected value of T, E[T], is infinite, and then the chain is null recurrent. To compute E[T] we need to recall the definition of the special states x_{n_j}, j ≥ 1, and their properties from the proof of Proposition 3.6.
Also recall the definition of the transition probabilities under Hypothesis A from Remark 3.1. Put x_{n_0} := x_0 and consider the following decomposition:

T = Σ_{j=0}^{∞} T I_{{Y_{T−1} = x_{n_j}}},

where I_W is the indicator function of the event W; by property (iv), the state visited at time T − 1 is necessarily one of the x_{n_j}. We obtain that

(3.3) E[T] = Σ_{j=0}^{∞} E[T | Y_{T−1} = x_{n_j}] P(Y_{T−1} = x_{n_j}).
Clearly on {Y_{T−1} = x_{n_0}} we have T = 1 and P(Y_{T−1} = x_{n_0}) = 1 − p_1. Using item (v) in the proof of Proposition 3.6 we get that

(3.4) P(Y_{T−1} = x_{n_j}) = (Π_{i=1}^{j} p_i)(1 − p_{j+1}).

We also have that

(3.5) E[T | Y_{T−1} = x_{n_0}] = 1.

Claim: For every j ≥ 0,

E[T | Y_{T−1} = x_{n_j}] ≥ 1 + Σ_{i=1}^{j} (Π_{r=1}^{i} p_r)^{−1}.

Suppose that the claim holds. Then by (3.3) and (3.4) we have that

(3.6) E[T] ≥ (1 − p_1) + (1 + 1/p_1) p_1 (1 − p_2)
(3.7)   + (1 + 1/p_1 + 1/(p_1 p_2)) p_1 p_2 (1 − p_3)
(3.8)   + ···

Rearranging terms and putting p_0 = 1, we obtain

(3.9) E[T] ≥ Σ_{m=0}^{∞} Σ_{j=0}^{∞} p_{m+1} ··· p_{m+j} (1 − p_{m+j+1})
(3.10)     = Σ_{m=0}^{∞} (1 − Π_{j=m+1}^{∞} p_j) = Σ_{m=0}^{∞} 1 = ∞.

Thus the chain is null recurrent. It remains to prove the Claim. We prove it by induction. Suppose the claim holds for j (the case j = 0 is (3.5)). Given {Y_{T−1} = x_{n_{j+1}}}, write T = T' + T'', where T' is the time of the first visit of the chain to x_{n_j + 1} and T'' is the time spent on {x_{n_j + 1}, ..., x_{n_{j+1}}} until it arrives at x_0. By the induction hypothesis,

E[T' | Y_{T−1} = x_{n_{j+1}}] ≥ 1 + Σ_{i=1}^{j} (Π_{r=1}^{i} p_r)^{−1}.

It remains to prove that

E[T'' | Y_{T−1} = x_{n_{j+1}}] ≥ (Π_{r=1}^{j+1} p_r)^{−1}.

Time T'' is greater than or equal to the number of transitions needed to get to x_{n_{j+1}} from x_{n_j + 1}, and this is bounded below by the number of trials needed to obtain a simultaneous success of j + 1 independent Bernoulli random variables with parameters p_1, ..., p_{j+1}. It is an exercise
in probability theory, using geometric random variables, to prove that this number of trials has expected value equal to (Π_{r=1}^{j+1} p_r)^{−1}.

From the proof of Proposition 3.7 we see that we can drop Hypothesis A if the sequence (p_i)_{i≥1} is constant and the Bratteli diagram is stationary.

Proposition 3.8. Let B be a stationary ordered Bratteli diagram. If p_i = p ∈ (0, 1) for every i ≥ 1, then every BV stochastic adding machine associated to (p_i)_{i≥1} is null recurrent.

Although we have Propositions 3.7 and 3.8, a BV stochastic adding machine associated to (p_i)_{i≥1} such that Π_{j=1}^{∞} p_j = 0 can be positive recurrent, so Hypothesis A is necessary. In Example 3.5 we described a stationary BV stochastic adding machine associated to an ordered Bratteli diagram with consecutive ordering which can be positive recurrent for a sufficiently fast decreasing sequence (p_i)_{i≥1}.

4. Stochastic machines of stationary Bratteli diagrams

Let B = (V, E, ≥) be a stationary simple ordered Bratteli diagram with incidence matrix M = (a b; c d). Since B is simple, we have necessarily b > 0 and c > 0; moreover, either a > 0 or d > 0. We can change the labels of the vertices in B if necessary and suppose that a > 0. Therefore a + b > 1, and Hypothesis A is equivalent to c + d > 1.

We start with a proposition that gives a condition on Bratteli diagrams that allows the existence of positive recurrent BV stochastic adding machines.

Proposition 4.1. Let B = (V, E, ≥) be a stationary simple ordered Bratteli diagram with a = c = 1, b > 0 and d = 0. Then the BV stochastic adding machine associated to (p_j)_{j≥1} is positive recurrent if p_j decreases to zero sufficiently fast as j → ∞.

Proof. Recall the definitions from the proof of Proposition 3.7. In order to prove that the stochastic machine is positive recurrent we have to show that E[T] < ∞. We claim that there exists a sequence (C_j)_{j≥1}, depending on b but not on (p_j)_{j≥1}, such that

(4.1) E[T] ≤ C_1 + Σ_{j=2}^{∞} C_j max{p_{j−1}, p_j}.
From the previous inequality, one simply needs to choose p_j ≤ r_j/(C_j + C_{j+1}) with Σ_j r_j < ∞. To prove (4.1) we use (3.3) and (3.4). So we need to bound from above the conditional expectation E[T | Y_{T−1} = x_{n_j}]. The particular form of x_{n_j} is important here. We have that

x_{n_1} = ((2, b, 1), (1, 0, 1), (1, 0, 1), ...),
thus the time to get to x_{n_1} + 1 from x_0, given Y_{T−1} = x_{n_1}, is equal to one plus a negative binomial random variable with parameters b and p_1, because the chain uses one unit of time to leave x_0 and then spends a geometric time of parameter p_1 on each of the last b edges of E(1) with range 1. Therefore

E[T | Y_{T−1} = x_{n_1}] = 1 + b/p_1

and

E[T | Y_{T−1} = x_{n_1}] P(Y_{T−1} = x_{n_1}) ≤ p_1 + b ≤ 1 + b =: C_1.

Before we can use induction on j we still need to deal with E[T | Y_{T−1} = x_{n_2}], and we need to compute the mean time to get to x_{n_2} + 1 from x_{n_1} + 1. We have

x_{n_1} + 1 = ((1, 0, 2), (2, 1, 1), (1, 0, 1), (1, 0, 1), ...),

where the first edge is the unique edge in E(1) with range 2. So from x_{n_1} + 1 we only need to change b edges in E(2) to get to x_{n_2} + 1, and on each of these edges we spend a geometric time of parameter p_2. Therefore

E[T | Y_{T−1} = x_{n_2}] = E[T | Y_{T−1} = x_{n_1}] + b/p_2 = 1 + b/p_1 + b/p_2,

and E[T | Y_{T−1} = x_{n_2}] P(Y_{T−1} = x_{n_2}) is bounded above by

p_1 p_2 + b p_2 + b p_1 ≤ (1 + 2b) max{p_1, p_2} =: C_2 max{p_1, p_2}.

Analogous estimates allow us to show that E[T | Y_{T−1} = x_{n_3}] P(Y_{T−1} = x_{n_3}) is bounded above by

p_1 p_2 p_3 + p_2 p_3 b + p_1 p_3 b + p_1 p_2 b ≤ (1 + 3b) max{p_2, p_3}.

Now suppose that

E[T | Y_{T−1} = x_{n_j}] P(Y_{T−1} = x_{n_j}) ≤ C_j max{p_{j−1}, p_j},

and we are going to estimate E[T | Y_{T−1} = x_{n_{j+1}}]. Using the fact that a = c = 1, to go from x_{n_j} + 1 to x_{n_{j+1}} + 1 we need to change b edges in E(j+1), without changing the edge (j+1, 1, 0, 2) ∈ E(j+1), but considering all paths in E(1) ∘ ... ∘ E(j) with range in V(j). Thus

E[T | Y_{T−1} = x_{n_{j+1}}] ≤ E[T | Y_{T−1} = x_{n_j}] + (b/p_{j+1}) E[T | Y_{T−1} = x_{n_j}].

Thus E[T | Y_{T−1} = x_{n_{j+1}}] P(Y_{T−1} = x_{n_{j+1}}) is bounded above by

C_j max{p_{j−1}, p_j} p_{j+1} + C_j max{p_{j−1}, p_j} p_{j+1} ≤ 2 C_j max{p_j, p_{j+1}}.

So we just need to take C_{j+1} = 2 C_j.
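The role of the summability condition in the choice p_j ≤ r_j/(C_j + C_{j+1}) can be checked numerically; in the sketch below all concrete values (b = 2, r_j = 2^{-j}, the horizon J) are our own illustrative choices:

```python
# Illustrative check that the series in (4.1) stays bounded for the
# choice p_j = r_j / (C_j + C_{j+1}) with summable r_j = 2^-j.
b = 2
J = 40
C = [None, 1.0 + b]                      # C_1 = 1 + b, C_{j+1} = 2 C_j
for _ in range(J):
    C.append(2 * C[-1])
r = [None] + [2.0 ** -j for j in range(1, J + 1)]
p = [None] + [r[j] / (C[j] + C[j + 1]) for j in range(1, J + 1)]
# partial sums of C_j * max(p_{j-1}, p_j), j >= 2, remain bounded
total = sum(C[j] * max(p[j - 1], p[j]) for j in range(2, J + 1))
```

Here each summand equals (2/3) r_{j−1}, so the partial sums stay below Σ_j r_j ≤ 1, consistent with (4.1) yielding E[T] < ∞ for such a choice of (p_j).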
From the proof of Proposition 4.1 we can also see that it is enough to have (j p_j)_j summable to obtain a positive recurrent stochastic machine from the hypothesis of the proposition. As a corollary we get the result from [7] about the existence of positive recurrent Fibonacci stochastic adding machines.

Corollary 4.2. The Fibonacci stochastic adding machines associated to (p_j)_{j≥1} are positive recurrent if p_j decreases to zero sufficiently fast as j → ∞.

To continue the study of V stochastic machines of Bratteli diagrams, we need to introduce some notation related to the systems of numeration associated to Bratteli diagrams. Let us denote M_n by

M_n = ( a_n  b_n ; c_n  d_n ),

for all n ≥ 1, where M_0 = Id is the identity matrix. For each n ≥ 1, put F_n = a_n + b_n and G_n = c_n + d_n. This gives F_0 = G_0 = 1, F_1 = a + b and G_1 = c + d.

Remark 4.3. For each nonnegative integer n, F_n is the number of paths from V(0) to the vertex (n, 0) in the Bratteli diagram. Respectively, G_n is the number of paths from V(0) to the vertex (n, 1).

Lemma 4.4. We have F_{n+1} = (a + d) F_n − (ad − bc) F_{n−1} and G_{n+1} = (a + d) G_n − (ad − bc) G_{n−1}, for all n ≥ 1.

Proof. It follows from the fact that (F_{n+1}, G_{n+1})^T = M (F_n, G_n)^T for all n ≥ 0, and that the characteristic polynomial of M is p(x) = x² − (a + d) x + (ad − bc).

4.1. The case under Hypothesis A and consecutive ordering. From now on we assume that abc > 0, c + d > 1 and that B is endowed with the consecutive ordering. Thus B is simple and satisfies Hypothesis A. Moreover X_min = {x_0} is a unitary set and for each x ∈ X we have A(x) = {1, ..., ζ(x) − 1}. The aim of this section is the study of the spectrum of V stochastic machines under these conditions. We first need to establish proper notation to deal with the possible transitions of the chain in X = X_{x_0}. Define 0_j as the minimum edge of E(j) with range 0, i.e. 0_j = (j, 0, 0, 0). For convenience we will sometimes omit the level index j, simply writing 0 = (0, 0, 0). Let x = (e_j)_{j≥1} = ((s_j, m_j, r_j))_{j≥1} ∈ X.
Recall that x_0 = (0_j)_{j≥1} and that x ∈ X_{x_0} is cofinal with x_0; thus there exists N ∈ ℕ such that x_N := V^N(x_0) = x. Put ξ(x) = min{j ≥ 1 : e_l = 0_l for all l > j}.
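The sequences F_n and G_n introduced above are straightforward to compute from powers of M, and Lemma 4.4 gives a handy cross-check. A minimal sketch (ours; the function name is an arbitrary choice, and the matrix entries used are those of Example 4.6 below):

```python
# F_n = a_n + b_n and G_n = c_n + d_n are the row sums of M^n, with M^0 = Id.

def FG_sequences(a, b, c, d, n_max):
    """Return [F_0, ..., F_{n_max}] and [G_0, ..., G_{n_max}]."""
    an, bn, cn, dn = 1, 0, 0, 1  # M^0 = identity
    F, G = [], []
    for _ in range(n_max + 1):
        F.append(an + bn)
        G.append(cn + dn)
        # M^{n+1} = M^n * M
        an, bn, cn, dn = an*a + bn*c, an*b + bn*d, cn*a + dn*c, cn*b + dn*d
    return F, G

F, G = FG_sequences(1, 3, 1, 4, 5)
print(F[:4], G[:4])  # [1, 4, 19, 91] [1, 5, 24, 115]

# Lemma 4.4: F_{n+1} = (a+d) F_n - (ad-bc) F_{n-1}; same recurrence for G_n.
assert all(F[n+1] == 5*F[n] - 1*F[n-1] for n in range(1, 5))
assert all(G[n+1] == 5*G[n] - 1*G[n-1] for n in range(1, 5))
```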
The reader should recall the definition of ζ(x) and note that ζ(x) and ξ(x) play different roles.

4.2. Numeration systems associated to Bratteli diagrams.

Definition 4.5. Let x ∈ X. For each j ∈ {1, ..., ξ(x)}, define δ_j(x) = δ_j and γ_j(x) = γ_j according to the following four cases:
(i) If s_j = 0 and r_j = 0, then δ_j = m_j ∈ {0, ..., a−1} and γ_j = 0;
(ii) If s_j = 1 and r_j = 0, then δ_j = a and γ_j = m_j − a ∈ {0, ..., b−1};
(iii) If s_j = 0 and r_j = 1, then δ_j = m_j ∈ {0, ..., c−1} and γ_j = 0;
(iv) If s_j = 1 and r_j = 1, then δ_j = c and γ_j = m_j − c ∈ {0, ..., d−1}.
For x_N = V^N(x_0) = x we also denote δ_j = δ_j(N) and γ_j = γ_j(N). Observe that m_j = δ_j + γ_j, for all j ≥ 1. Moreover, if d = 0, then (s_j, r_j) ≠ (1, 1), for all j ≥ 1.

Example 4.6. Consider the consecutive ordering Bratteli diagram represented by the matrix M = (1 3; 1 4). By Lemma 4.4, we have

F_0 = 1, F_1 = 4, F_2 = 19, F_3 = 91, ...
G_0 = 1, G_1 = 5, G_2 = 24, G_3 = 115, ...

Consider x, y ∈ X where x = (x_j)_{j≥1} = ((1,3,1), (1,4,1), (1,2,0), 0, 0, ...) and y = (y_j)_{j≥1} = ((1,2,0), (0,0,1), (1,3,0), 0, 0, ...). The representation of x and y in the Bratteli diagram is given respectively in items (a) and (b) of Figure 8. By Definition 4.5, we have that

δ_1(x) = 1, γ_1(x) = 2; δ_2(x) = 1, γ_2(x) = 3; δ_3(x) = 1, γ_3(x) = 1; δ_i(x) = γ_i(x) = 0, for all i ≥ 4,

and

δ_1(y) = 1, γ_1(y) = 1; δ_2(y) = 0, γ_2(y) = 0; δ_3(y) = 1, γ_3(y) = 2; δ_i(y) = γ_i(y) = 0, for all i ≥ 4.

Proposition 4.7. Let N be a nonnegative integer and x ∈ X such that V^N(x_0) = x. Then N = ∑_{j=0}^{ξ(x)−1} (δ_{j+1} F_j + γ_{j+1} G_j), where δ_i(N) = δ_i and γ_i(N) = γ_i are defined in Definition 4.5, for all i ≥ 1.

Proof. Fix a nonnegative integer N and let x = V^N(x_0) = (e_1, e_2, e_3, ...), with e_i = (s_i, m_i, r_i) for all i ≥ 1. From Remark 4.3, we have that for each integer k ≥ 1

(4.2) V^{F_k}(x_0) = (0, ..., 0 ((k−1) times), ẽ, f, 0, 0, ...),

where either ẽ = 0 and f = (0, 1, 0) if a > 1, or ẽ = (0, 0, 1) and f = (1, 1, 0) if a = 1. Thus, since e_k = (s_k, m_k, r_k), it follows that if s_k = 0 and r_k = 0, then a > 1,
Figure 8. Representation of a Bratteli diagram: items (a) and (b) show the paths x and y of Example 4.6 through the edge levels E(1), E(2), E(3), E(4).

m_k ∈ {1, ..., a−1} and

(4.3) V^{δ_k F_{k−1} + γ_k G_{k−1}}(x_0) = V^{m_k F_{k−1}}(x_0) = (0, ..., 0 ((k−2) times), ẽ, (0, m_k, 0), 0, 0, ...),

and if s_k = 1 and r_k = 0, then m_k ∈ {a, ..., a+b−1} and

(4.4) V^{δ_k F_{k−1} + γ_k G_{k−1}}(x_0) = V^{a F_{k−1} + (m_k − a) G_{k−1}}(x_0) = (0, ..., 0 ((k−2) times), ẽ, (1, m_k, 0), 0, 0, ...),

where ẽ = 0 in (4.3) and ẽ = (0, 0, 1) in (4.4).

Now, consider k = ξ(x) = min{j ≥ 1 : e_l = 0 for all l > j} and put N_k = δ_k F_{k−1} + γ_k G_{k−1}. For each j ∈ {1, ..., k−1}, let N_j = δ_j F_{j−1} + γ_j G_{j−1} + N_{j+1} and x(j+1) := (0, ..., 0 ((j−1) times), ẽ, e_{j+1}, e_{j+2}, ..., e_k, 0, 0, ...), where ẽ is the minimal edge of E(j) with range s_{j+1}. Suppose that V^{N_{j+1}}(x_0) = x(j+1), for some j ∈ {1, ..., k−1}. Here, we need to consider four cases:
i) s_j = 0 and r_j = 0;
ii) s_j = 1 and r_j = 0;
iii) s_j = 0 and r_j = 1;
iv) s_j = 1 and r_j = 1.
For example, in case ii) we have ẽ = (0, 0, 1), m_j ∈ {a, ..., a+b−1} and V^{N_j}(x_0) = V^{a F_{j−1} + (m_j − a) G_{j−1}}(V^{N_{j+1}}(x_0)) = V^{a F_{j−1} + (m_j − a) G_{j−1}}(x(j+1)) = x(j). In the same way, we can check that V^{N_j}(x_0) = x(j) for the other cases. By induction we have V^{N_1}(x_0) = x(1) = x and, since δ_i = γ_i = 0 for all i > k, it follows that N_1 = ∑_{i=1}^{k} (δ_i F_{i−1} + γ_i G_{i−1}) = ∑_{i≥1} (δ_i F_{i−1} + γ_i G_{i−1}) = N.

Remark 4.8. We believe that the last proposition is another formulation of Lemma 4 in [5], which gives a formula for the first entrance time map.

Remark 4.9. We call ((δ_1, γ_1), (δ_2, γ_2), ...) the (F, G)-representation of N and we write N = ∑_{j=0}^{ξ(x)−1} (δ_{j+1} F_j + γ_{j+1} G_j) = ((δ_1, γ_1), (δ_2, γ_2), ...). The set of (F, G)-representations is recognized by a finite graph called an automaton (see Figure 9).

Figure 9. Automaton related to the (F, G)-representation of N = ((δ_1, γ_1), (δ_2, γ_2), ...), where δ_i ∈ {x_a, x_c} and γ_i ∈ {y_b, y_d}, for all i ≥ 1, with x_a ∈ {0, ..., a−1}, x_c ∈ {0, ..., c−1}, y_b ∈ {0, ..., b−1} and y_d ∈ {0, ..., d−1}.

Remark 4.10. In Example 4.6, it follows by Proposition 4.7 that L(N_x) = x and L(N_y) = y, where N_x = F_0 + 2G_0 + F_1 + 3G_1 + F_2 + G_2 = 65 and N_y = F_0 + G_0 + F_2 + 2G_2 = 69.

Example 4.11. If M = (d) for d ≥ 2, by Proposition 4.7 we obtain the numeration in base d, with digits {0, 1, ..., d−1}.

Remark 4.12. We can define the sequences (δ_i(x))_{i≥1} and (γ_i(x))_{i≥1} for all x ∈ X, as done in Definition 4.5, in the case where the Bratteli diagram does not satisfy Hypothesis A, i.e. the incidence matrix is M = (a b; 1 0), where ab > 0. Furthermore, in this case δ_i ∈ {0, ..., a} and γ_i ∈ {0, ..., b−1}, for all i ≥ 1. Moreover, by Lemma 4.4, we have that G_n = F_{n−1}, for all n ≥ 1. By Proposition 4.7, the (F, G)-representation of N is given by the automaton represented in Figure 10. Observe in Figure 10 that when b = 1, the representation of N is equal to ((δ_1, 0), (δ_2, 0), ...), with δ_{i+1} δ_i <_lex a1, for all i ≥ 1.
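Proposition 4.7 turns a path into an integer via N = ∑_j (δ_{j+1} F_j + γ_{j+1} G_j). A minimal sketch (ours) of this evaluation, using the diagram of Example 4.6; the helper name `value` is an arbitrary choice:

```python
# Evaluate the (F,G)-representation of Proposition 4.7 for M = [[1, 3], [1, 4]].

F = [1, 4, 19, 91]   # F_0..F_3 (Example 4.6)
G = [1, 5, 24, 115]  # G_0..G_3

def value(digits):
    """digits = [(delta_1, gamma_1), (delta_2, gamma_2), ...]."""
    return sum(d * F[j] + g * G[j] for j, (d, g) in enumerate(digits))

N_x = value([(1, 2), (1, 3), (1, 1)])  # digit pairs of the path x of Example 4.6
N_y = value([(1, 1), (0, 0), (1, 2)])  # digit pairs of the path y
print(N_x, N_y)  # 65 69
```

This reproduces the values N_x = 65 and N_y = 69 of Remark 4.10.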
Figure 10. Automaton related to the (F, G)-representation, where x_a ∈ {0, ..., a} and y_b ∈ {0, ..., b−1}.

4.3. Spectrum of the stochastic machines of Bratteli diagrams. We are finally in position to compute the spectrum of the transition operator (acting on l∞) of the V stochastic adding machines associated to a stationary Bratteli diagram endowed with the consecutive ordering. We denote the spectrum, point spectrum and approximate point spectrum of the transition operator S respectively by σ(S), σ_pt(S) and σ_a(S). Recall that λ belongs to σ(S) (resp. σ_pt(S)) if S − λI is not bijective (resp. not one-to-one). Also, λ ∈ σ_a(S) if there exists a sequence (v_n)_{n≥1} such that ‖v_n‖ = 1, for all n ≥ 1, and (S − λI)v_n converges to 0 when n goes to infinity.

For each λ ∈ ℂ, let (u_{F_n}(λ))_{n≥0} = (u_{F_n})_{n≥0} and (w_{F_n}(λ))_{n≥0} = (w_{F_n})_{n≥0} be the sequences defined by u_{F_0} = w_{F_0} = (λ − (1 − p_1))/p_1 and

(4.5) u_{F_n} = (u_{F_{n−1}}^a w_{F_{n−1}}^b − (1 − p_{n+1}))/p_{n+1},  w_{F_n} = (u_{F_{n−1}}^c w_{F_{n−1}}^d − (1 − p_{n+1}))/p_{n+1},

for all n ≥ 1. From this, let (v_n)_{n≥1} be the sequence defined by v_n = ∏_{i=0}^{ξ(n)−1} u_{F_i}^{δ_{i+1}} w_{F_i}^{γ_{i+1}}, where δ_j = δ_j(n) and γ_j = γ_j(n), j ∈ {1, ..., ξ(n)}, are given in Definition 4.5. Since v_{F_n} = u_{F_n}, for all n ≥ 0, we will denote v_n by u_n.

Theorem 4.13. Let S be the transition operator of a V stochastic machine associated to a Bratteli diagram B. Then, acting on l∞(ℕ), the set of eigenvalues of S is σ_pt(S) = {λ ∈ ℂ : (u_n(λ))_{n≥1} is bounded}.

Remark 4.14. From Theorem 4.13, we deduce that σ_pt(S) ⊂ {λ ∈ ℂ : (u_{F_n}(λ))_{n≥0} is bounded}. Moreover, if det M < 0, we can show (see Proposition 4.17) that σ_pt(S) ⊂ E := {λ ∈ ℂ : ((u_{F_n}(λ), w_{F_n}(λ)))_{n≥0} is bounded}. Since g_n ∘ ⋯ ∘ g_1(u_{F_0}, u_{F_0}) = (u_{F_n}, w_{F_n}), for all n ≥ 1, where the g_n : ℂ² → ℂ² are the polynomial maps defined by

g_n(x, y) = ( (x^a y^b − (1 − p_{n+1}))/p_{n+1}, (x^c y^d − (1 − p_{n+1}))/p_{n+1} ),

for all n ≥ 1, it follows that σ_pt(S) is contained in the set {λ ∈ ℂ : ((λ − 1 + p_1)/p_1, (λ − 1 + p_1)/p_1) ∈ K}, where

K := {(x, y) ∈ ℂ² : (g_n ∘ ⋯ ∘ g_1(x, y))_{n≥1} is bounded}.
This set is the 2-dimensional fibered filled Julia set associated to (g_n)_{n≥1} (for more on fibered Julia sets see [34] and references therein). In particular, if (p_i)_{i≥1} is constant, then K is a 2-dimensional filled Julia set.

For the proof of Theorem 4.13, we need the following lemma.

Lemma 4.15. For all z = (z_i)_{i≥0} ∈ l∞,

(Sz)_N = (∏_{j=1}^{ζ_N} p_j) z_{N+1} + (1 − p_1) z_N + ∑_{r=1}^{ζ_N − 1} (∏_{j=1}^{r} p_j)(1 − p_{r+1}) z_{N − ∑_{j=0}^{r−1}(δ_{j+1} F_j + γ_{j+1} G_j)}, if ζ_N ≥ 2,

and (Sz)_N = p_1 z_{N+1} + (1 − p_1) z_N if ζ_N = 1, where the δ_i, γ_i are given in Definition 4.5.

Proof. Let N ∈ ℕ and V^N(x_0) = x = (e_1, e_2, e_3, ...). All we need to do is identify S_{N,Ñ} for Ñ ∈ ℕ. Let ξ(x) = k and ζ(x) = ζ_N. Thus x = (e_1, ..., e_{ζ_N − 1}, e_{ζ_N}, e_{ζ_N + 1}, ..., e_k, 0, 0, ...) and, under Hypothesis A, we have that A(x) = {1, ..., ζ_N − 1}. From Definition 3.1 and Remark 3.2, we have that S_{N,N} = 1 − p_1 and S_{N,N+1} = ∏_{j=1}^{ζ_N} p_j. Thus, if ζ_N = 1 (so that A(x) = ∅ and S_{N,Ñ} = 0 for Ñ ∉ {N, N+1}), we are done.

Suppose that ζ_N ≥ 2. For each i ∈ A(x), consider y_i(x) defined by relation (3.1). We can check that

y_i(x) = (0, ..., 0 ((i−1) times), ẽ, e_{i+1}, e_{i+2}, ..., e_{ζ_N − 1}, e_{ζ_N}, e_{ζ_N + 1}, ..., e_k, 0, 0, ...),

where ẽ = (0, 0, 0) = 0 if s_{i+1} = 0 and ẽ = (0, 0, 1) if s_{i+1} = 1. For each i ∈ A(x), let N_i ∈ ℕ be such that V^{N_i}(x_0) = y_i(x). Thus, from Proposition 4.7, we have that N_i = N − ∑_{j=0}^{i−1} (δ_{j+1} F_j + γ_{j+1} G_j). Hence, from Remark 3.2, we have that S_{N,N_i} = (∏_{j=1}^{i} p_j)(1 − p_{i+1}). Furthermore S_{N,Ñ} = 0 if Ñ ∉ {N, N+1} ∪ {N_i : i ∈ A(x)}, and the proof is finished.

Our next step is to prove Theorem 4.13. The proof uses the same idea as in the case M = (d), d ≥ 2, treated in [9]. However, the extension is far from elementary.

Proof of Theorem 4.13. Let z = (z_N)_{N≥0} be a sequence of complex numbers such that (Sz)_N = λ z_N for every N ≥ 0. We shall prove that z_N = u_N z_0 for all N ≥ 1. For this we need to have in mind the representation of N as a path in X, i.e. x = V^N(x_0) = (e_1, ..., e_{ξ(x)}, 0, 0, ...) where e_j = (s_j, m_j, r_j), 1 ≤ j ≤ ξ(x). The proof is based on the representation of Lemma 4.15. We use induction on N ∈ ℕ.
For N = 0 we have by definition that δ_j = γ_j = 0 for all j ≥ 1. Furthermore, λ z_0 = (1 − p_1) z_0 + p_1 z_1, hence z_1 = ((λ − 1 + p_1)/p_1) z_0 = u_1 z_0.
Now fix N ≥ 1, write N = ((δ_1, γ_1), (δ_2, γ_2), ...) and suppose that z_j = u_j z_0 for all j ∈ {1, ..., N}. Suppose first that ζ_N = 1. Since

z_N = z_0 ∏_{i=0}^{ξ(x)−1} u_{F_i}^{δ_{i+1}} w_{F_i}^{γ_{i+1}} = u_{F_0}^{δ_1} w_{F_0}^{γ_1} z_0 ∏_{i=1}^{ξ(x)−1} u_{F_i}^{δ_{i+1}} w_{F_i}^{γ_{i+1}},

(Sz)_N = p_1 z_{N+1} + (1 − p_1) z_N = λ z_N and

(4.6) u_{F_0} = w_{F_0} = (λ − (1 − p_1))/p_1,

we have z_{N+1} = u_{F_0} z_N = u_{F_0}^{δ_1 + γ_1 + 1} z_0 ∏_{i=1}^{ξ(x)−1} u_{F_i}^{δ_{i+1}} w_{F_i}^{γ_{i+1}}. From here, we need to consider two cases:

Case 1: if s_1 = 0, then δ_1 < a if r_1 = 0 and δ_1 < c if r_1 = 1. Furthermore, γ_1 = 0. Thus N + 1 = ((δ_1 + 1, γ_1), (δ_2, γ_2), ...) and

u_{N+1} = u_{F_0}^{δ_1 + 1} w_{F_0}^{γ_1} ∏_{i=1}^{ξ(x)−1} u_{F_i}^{δ_{i+1}} w_{F_i}^{γ_{i+1}} = u_{F_0}^{δ_1 + γ_1 + 1} ∏_{i=1}^{ξ(x)−1} u_{F_i}^{δ_{i+1}} w_{F_i}^{γ_{i+1}}.

Case 2: if s_1 = 1, then δ_1 = a and γ_1 < b − 1 if r_1 = 0, and δ_1 = c and γ_1 < d − 1 if r_1 = 1. Thus N + 1 = ((δ_1, γ_1 + 1), (δ_2, γ_2), ...) and

u_{N+1} = u_{F_0}^{δ_1} w_{F_0}^{γ_1 + 1} ∏_{i=1}^{ξ(x)−1} u_{F_i}^{δ_{i+1}} w_{F_i}^{γ_{i+1}} = u_{F_0}^{δ_1 + γ_1 + 1} ∏_{i=1}^{ξ(x)−1} u_{F_i}^{δ_{i+1}} w_{F_i}^{γ_{i+1}}.

Hence, in both cases we have that z_{N+1} = u_{N+1} z_0. Now for ζ_N ≥ 2 we consider separately the cases d > 0 and d = 0.

Case d > 0: First, suppose that ζ_N = 2 (i.e. e_1 is a maximal edge and e_2 is not maximal). Thus, by Lemma 4.15 and the fact that (Sz)_N = λ z_N, we have

z_{N+1} = (1/(p_1 p_2)) ( (λ − (1 − p_1)) z_N − p_1 (1 − p_2) z_{N − δ_1 F_0 − γ_1 G_0} ).

Hence,

z_{N+1} / ( z_0 ∏_{r=1}^{ξ(x)−1} u_{F_r}^{δ_{r+1}} w_{F_r}^{γ_{r+1}} ) = ( (λ − (1 − p_1)) u_{F_0}^{δ_1} w_{F_0}^{γ_1} − p_1 (1 − p_2) ) / (p_1 p_2).

Since e_1 is a maximal edge, it follows that s_1 = 1. If r_1 = 0, then δ_1 = a and γ_1 = b − 1, and if r_1 = 1, then δ_1 = c and γ_1 = d − 1. Thus,

z_{N+1} / ( z_0 ∏_{r=1}^{ξ(x)−1} u_{F_r}^{δ_{r+1}} w_{F_r}^{γ_{r+1}} ) =
( ((λ − (1 − p_1))/p_1) u_{F_0}^a w_{F_0}^{b−1} − (1 − p_2) ) / p_2, if r_1 = 0,
( ((λ − (1 − p_1))/p_1) u_{F_0}^c w_{F_0}^{d−1} − (1 − p_2) ) / p_2, if r_1 = 1.
By (4.6), we deduce

z_{N+1} / ( z_0 ∏_{r=1}^{ξ(x)−1} u_{F_r}^{δ_{r+1}} w_{F_r}^{γ_{r+1}} ) =
( u_{F_0}^{a+b} − (1 − p_2) ) / p_2, if r_1 = 0,
( w_{F_0}^{c+d} − (1 − p_2) ) / p_2, if r_1 = 1,

and so

(4.7) z_{N+1} =
z_0 u_{F_1}^{δ_2 + 1} w_{F_1}^{γ_2} ∏_{r=2}^{ξ(x)−1} u_{F_r}^{δ_{r+1}} w_{F_r}^{γ_{r+1}}, if r_1 = 0,
z_0 u_{F_1}^{δ_2} w_{F_1}^{γ_2 + 1} ∏_{r=2}^{ξ(x)−1} u_{F_r}^{δ_{r+1}} w_{F_r}^{γ_{r+1}}, if r_1 = 1.

Since

N = ((δ_1, γ_1), (δ_2, γ_2), ...) = ((a, b−1), (δ_2, γ_2), ...), if r_1 = 0, and ((c, d−1), (δ_2, γ_2), ...), if r_1 = 1,

it follows that

N + 1 = ((0, 0), (δ_2 + 1, γ_2), ...), if r_1 = 0, and ((0, 0), (δ_2, γ_2 + 1), ...), if r_1 = 1,

and from (4.7), we have that z_{N+1} = u_{N+1} z_0.

Finally we have to consider ζ_N ≥ 3. In this case, since (e_1, ..., e_{ζ_N − 1}) is a maximal element of E(1) × E(2) × ⋯ × E(ζ_N − 1) and d > 0, it follows that s_j = r_j = 1 for all j ∈ {1, ..., ζ_N − 2}. Therefore, m_j = c + d − 1 (i.e. δ_j = c and γ_j = d − 1) for all j ∈ {1, ..., ζ_N − 2}. Furthermore, we have two subcases:
(1) if r_{ζ_N − 1} = 0, then m_{ζ_N − 1} = a + b − 1 (i.e. δ_{ζ_N − 1} = a and γ_{ζ_N − 1} = b − 1);
(2) if r_{ζ_N − 1} = 1, then m_{ζ_N − 1} = c + d − 1 (i.e. δ_{ζ_N − 1} = c and γ_{ζ_N − 1} = d − 1).

Thus, by Lemma 4.15 and Definition 4.5, since (Sz)_N = λ z_N, we have that

(4.8) z_{N+1} / ( z_0 ∏_{r=ζ_N}^{ξ(x)−1} u_{F_r}^{δ_{r+1}} w_{F_r}^{γ_{r+1}} ) =

(4.9) (λ − (1 − p_1)) ( ∏_{r=0}^{ζ_N − 3} u_{F_r}^c w_{F_r}^{d−1} ) u_{F_{ζ_N − 2}}^{δ_{ζ_N − 1}} w_{F_{ζ_N − 2}}^{γ_{ζ_N − 1}} u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=1}^{ζ_N} p_j
− (1 − p_2) ( ∏_{r=1}^{ζ_N − 3} u_{F_r}^c w_{F_r}^{d−1} ) u_{F_{ζ_N − 2}}^{δ_{ζ_N − 1}} w_{F_{ζ_N − 2}}^{γ_{ζ_N − 1}} u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=2}^{ζ_N} p_j
− ⋯ − (1 − p_{ζ_N − 1}) u_{F_{ζ_N − 2}}^{δ_{ζ_N − 1}} w_{F_{ζ_N − 2}}^{γ_{ζ_N − 1}} u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=ζ_N − 1}^{ζ_N} p_j
− (1 − p_{ζ_N}) u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / p_{ζ_N}.

By (4.6), the first term in (4.9) is equal to
w_{F_0}^{c+d} ( ∏_{r=1}^{ζ_N − 3} u_{F_r}^c w_{F_r}^{d−1} ) u_{F_{ζ_N − 2}}^{δ_{ζ_N − 1}} w_{F_{ζ_N − 2}}^{γ_{ζ_N − 1}} u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=2}^{ζ_N} p_j.

Summing with the second term, we get

( (w_{F_0}^{c+d} − (1 − p_2))/p_2 ) ( ∏_{r=1}^{ζ_N − 3} u_{F_r}^c w_{F_r}^{d−1} ) u_{F_{ζ_N − 2}}^{δ_{ζ_N − 1}} w_{F_{ζ_N − 2}}^{γ_{ζ_N − 1}} u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=3}^{ζ_N} p_j,

which is equal to

w_{F_1} ( ∏_{r=1}^{ζ_N − 3} u_{F_r}^c w_{F_r}^{d−1} ) u_{F_{ζ_N − 2}}^{δ_{ζ_N − 1}} w_{F_{ζ_N − 2}}^{γ_{ζ_N − 1}} u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=3}^{ζ_N} p_j = u_{F_1}^c w_{F_1}^d ( ∏_{r=2}^{ζ_N − 3} u_{F_r}^c w_{F_r}^{d−1} ) u_{F_{ζ_N − 2}}^{δ_{ζ_N − 1}} w_{F_{ζ_N − 2}}^{γ_{ζ_N − 1}} u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=3}^{ζ_N} p_j.

By induction, we have that the sum of the first ζ_N − 1 terms in (4.9) is equal to

u_{F_{ζ_N − 2}}^{δ_{ζ_N − 1}} w_{F_{ζ_N − 2}}^{γ_{ζ_N − 1} + 1} u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / p_{ζ_N}.

Finally, summing the previous expression with the last term in (4.9), we have that (4.8) is equal to

( (u_{F_{ζ_N − 2}}^a w_{F_{ζ_N − 2}}^b − (1 − p_{ζ_N}))/p_{ζ_N} ) u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} = u_{F_{ζ_N − 1}}^{δ_{ζ_N} + 1} w_{F_{ζ_N − 1}}^{γ_{ζ_N}}, if r_{ζ_N − 1} = 0;
( (u_{F_{ζ_N − 2}}^c w_{F_{ζ_N − 2}}^d − (1 − p_{ζ_N}))/p_{ζ_N} ) u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} = u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N} + 1}, if r_{ζ_N − 1} = 1.

Therefore,

z_{N+1} = z_0 u_{F_{ζ_N − 1}}^{δ_{ζ_N} + 1} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} ∏_{r=ζ_N}^{ξ(x)−1} u_{F_r}^{δ_{r+1}} w_{F_r}^{γ_{r+1}}, if r_{ζ_N − 1} = 0, and z_{N+1} = z_0 u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N} + 1} ∏_{r=ζ_N}^{ξ(x)−1} u_{F_r}^{δ_{r+1}} w_{F_r}^{γ_{r+1}}, if r_{ζ_N − 1} = 1;

in both cases z_{N+1} = u_{N+1} z_0, where the last equality comes from the fact that δ_i(N+1) = γ_i(N+1) = 0, for all i ∈ {1, ..., ζ_N − 1}.

Case d = 0: Suppose that r_1 = 0 and that ζ_N is an odd number (the cases r_1 = 1 or ζ_N even can be dealt with in the same way). Thus, since (e_1, ..., e_{ζ_N − 1}) is a maximal element of E(1) × E(2) × ⋯ × E(ζ_N − 1), we have that r_{2i−1} = 0, r_{2i} = 1, s_{2i−1} = 1 and s_{2i} = 0, for all i ∈ {1, ..., (ζ_N − 1)/2}. Therefore, m_{2i−1} = a + b − 1 (i.e. δ_{2i−1} = a and γ_{2i−1} = b − 1) and m_{2i} = c − 1 (i.e. δ_{2i} = c − 1 and γ_{2i} = 0), for all i ∈ {1, ..., (ζ_N − 1)/2}.
For each i ∈ {0, ..., ζ_N − 2}, let P_i be the product defined by P_i := ∏_{r=i}^{ζ_N − 2} u_{F_r}^{δ_{r+1}} w_{F_r}^{γ_{r+1}}. Thus, we have that P_i = u_{F_i}^a w_{F_i}^{b−1} P_{i+1} if i is an even number and P_i = u_{F_i}^{c−1} P_{i+1} if i is an odd number. By Lemma 4.15, since (Sz)_N = λ z_N, we have that

(4.10) z_{N+1} / ( z_0 ∏_{r=ζ_N}^{ξ(x)−1} u_{F_r}^{δ_{r+1}} w_{F_r}^{γ_{r+1}} ) =

(4.11) (λ − (1 − p_1)) P_0 u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=1}^{ζ_N} p_j − (1 − p_2) P_1 u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=2}^{ζ_N} p_j − (1 − p_3) P_2 u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=3}^{ζ_N} p_j − ⋯ − (1 − p_{ζ_N − 1}) P_{ζ_N − 2} u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=ζ_N − 1}^{ζ_N} p_j − (1 − p_{ζ_N}) u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / p_{ζ_N}.

Since u_{F_0} = w_{F_0} = (λ − 1 + p_1)/p_1, the first term in (4.11) is equal to

u_{F_0} P_0 u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=2}^{ζ_N} p_j = u_{F_0}^{a+b} P_1 u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=2}^{ζ_N} p_j.

Summing with the second term, we get

( (u_{F_0}^{a+b} − (1 − p_2))/p_2 ) P_1 u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=3}^{ζ_N} p_j = u_{F_1} P_1 u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=3}^{ζ_N} p_j = u_{F_1}^c P_2 u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=3}^{ζ_N} p_j.

Summing with the third term, we get

( (u_{F_1}^c − (1 − p_3))/p_3 ) P_2 u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=4}^{ζ_N} p_j = w_{F_2} P_2 u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=4}^{ζ_N} p_j = u_{F_2}^a w_{F_2}^b P_3 u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / ∏_{j=4}^{ζ_N} p_j.

By induction, we have that the sum of the first ζ_N − 1 terms in (4.11) is equal to

( (u_{F_{ζ_N − 3}}^a w_{F_{ζ_N − 3}}^b − (1 − p_{ζ_N − 1}))/p_{ζ_N − 1} ) P_{ζ_N − 2} u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / p_{ζ_N} = u_{F_{ζ_N − 2}}^c u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} / p_{ζ_N}.

Finally, summing the previous expression with the last term in (4.11), we have that (4.10) is equal to

( (u_{F_{ζ_N − 2}}^c − (1 − p_{ζ_N}))/p_{ζ_N} ) u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N}} = u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N} + 1}.
Therefore, z_{N+1} = z_0 u_{F_{ζ_N − 1}}^{δ_{ζ_N}} w_{F_{ζ_N − 1}}^{γ_{ζ_N} + 1} ∏_{r=ζ_N}^{ξ(x)−1} u_{F_r}^{δ_{r+1}} w_{F_r}^{γ_{r+1}} = u_{N+1} z_0, where the last equality comes from the fact that δ_i(N+1) = γ_i(N+1) = 0, for all i ∈ {1, ..., ζ_N − 1}.

Proposition 4.16. Let F := {λ ∈ ℂ : (u_{F_n}(λ))_{n≥0} is bounded}. Then σ_pt(S) ⊂ F ⊂ σ_a(S).

Proof. By Theorem 4.13, σ_pt(S) ⊂ F and we only have to prove that F ⊂ σ_a(S). Let λ ∈ F and suppose that λ ∉ σ_pt(S). We will prove that λ ∈ σ_a(S). In fact, for each k ≥ 1, consider

x^{(k)} = (x^{(k)}_0, x^{(k)}_1, x^{(k)}_2, ..., x^{(k)}_k, 0, 0, ...) = (1, u_1(λ), u_2(λ), ..., u_k(λ), 0, 0, ...),

where (u_n(λ))_n = (u_n)_n is the sequence defined through relation (4.5). Define y^{(k)} := x^{(k)} / ‖x^{(k)}‖_∞.

Claim: lim_{n→+∞} ‖(S − λI) y^{(F_n)}‖_∞ = 0.

In fact, for all i ∈ {0, ..., k−1}, we have ((S − λI) y^{(k)})_i = 0, and y^{(k)}_i = 0 for all i > k. Hence,

‖(S − λI) y^{(k)}‖_∞ = sup_{i ≥ k} |∑_{j≥0} (S − λI)_{ij} y^{(k)}_j| = sup_{i ≥ k} |∑_{j=0}^{k} (S − λI)_{ij} x^{(k)}_j| / ‖x^{(k)}‖_∞.

Let n > 1, k = F_n and i ≥ k. We consider two cases.

Case a > 1: If i = F_n, then since a > 1, by relation (4.2) we have that V^{F_n}(x_0) = (0, ..., 0 (n times), (0, 1, 0), 0, 0, ...). Since n > 1, it follows that S_{i,j} = 0, for all j ∈ {0, ..., F_n − 1}, and S_{i,i} = 1 − p_1. Therefore,

|∑_{j=0}^{F_n} (S − λI)_{ij} x^{(F_n)}_j| = |1 − p_1 − λ| |u_{F_n}|.

If F_n < i < 2F_n, then since a > 1, by the proof of Proposition 4.7 we have that V^i(x_0) = (e_1, ..., e_n, (0, 1, 0), 0, 0, ...). Hence S_{i,j} = 0, for all j ∈ {0, ..., F_n − 1}. Furthermore, since S_{i,i} = 1 − p_1 and S is a stochastic matrix, it follows that S_{i,j} ≤ p_1 for j = F_n. Therefore,

|∑_{j=0}^{F_n} (S − λI)_{ij} x^{(F_n)}_j| ≤ p_1 |u_{F_n}|.

If i ≥ 2F_n, then V^i(x_0) = (e_1, ..., e_l, 0, 0, ...), with e_l ≠ 0 and l ≥ n + 1. Since a > 1, we have m_l ≥ 1, and so S_{i,j} = 0 for all j ∈ {1, ..., F_n}. Furthermore,

S_{i,0} = p_1 ⋯ p_l (1 − p_{l+1}), if (e_1, ..., e_l) is a maximal path, and S_{i,0} = 0 if it is not.

Therefore, |∑_{j=0}^{F_n} (S − λI)_{ij} x^{(F_n)}_j| ≤ p_1 |x^{(F_n)}_0| = p_1.

Case a = 1: If i = F_n, then S_{i,j} = 0, for all j ∈ {0, ..., F_n − 1}, and S_{i,i} = 1 − p_1. Therefore, |∑_{j=0}^{F_n} (S − λI)_{ij} x^{(F_n)}_j| = |1 − p_1 − λ| |u_{F_n}|.

If F_n < i < F_n + G_n, then S_{i,j} = 0, for all j ∈ {0, ..., F_n − 1}, and S_{i,j} ≤ p_1 for j = F_n. Therefore, |∑_{j=0}^{F_n} (S − λI)_{ij} x^{(F_n)}_j| ≤ p_1 |u_{F_n}|.

If i = F_n + G_n, then S_{i,j} = 0, for all j ∈ {1, ..., F_n − 1}, S_{i,j} ≤ p_1 for j = F_n, S_{i,0} ≤ p_1 if b = 1 and S_{i,0} = 0 if b > 1. Therefore, |∑_{j=0}^{F_n} (S − λI)_{ij} x^{(F_n)}_j| ≤ p_1 + p_1 |u_{F_n}|.
If i > F_n + G_n, then S_{i,j} = 0, for all j ∈ {1, ..., F_n}, and S_{i,j} ≤ p_1 for j = 0. Therefore, |∑_{j=0}^{F_n} (S − λI)_{ij} x^{(F_n)}_j| ≤ p_1.

Hence, from both cases it follows that

(4.12) ‖(S − λI) y^{(F_n)}‖_∞ ≤ ( |1 − p_1 − λ| |u_{F_n}| + p_1 |u_{F_n}| + p_1 ) / ‖x^{(F_n)}‖_∞.

Since λ ∈ F and λ ∉ σ_pt(S), it follows that (u_{F_n})_n is a bounded sequence and (u_n)_n is not. Therefore, we have lim_{n→+∞} ‖x^{(F_n)}‖_∞ = +∞, which implies from relation (4.12) that lim_{n→+∞} ‖(S − λI) y^{(F_n)}‖_∞ = 0. Therefore, λ ∈ σ_a(S) ⊂ σ(S).

Proposition 4.17. If det M = ad − bc < 0, then (u_{F_n})_n is bounded if and only if (w_{F_n})_n is bounded.

Proof. Let R_n = p_{n+1} u_{F_n} + 1 − p_{n+1} and S_n = p_{n+1} w_{F_n} + 1 − p_{n+1}, for all n ≥ 0. By (4.5), we have that R_{n+1}^c = S_{n+1}^a w_{F_n}^{bc − ad} and S_{n+1}^b = R_{n+1}^d u_{F_n}^{bc − ad}. Since ad − bc < 0 and (p_n)_{n≥1} is bounded, we obtain the result.

Question. If det M > 0, is (u_{F_n})_n bounded equivalent to (w_{F_n})_n bounded?

Remark 4.18. From Remark 4.14 and Propositions 4.16 and 4.17, we have that if det M < 0, then σ_pt(S) ⊂ E = {λ ∈ ℂ : ((λ − 1 + p_1)/p_1, (λ − 1 + p_1)/p_1) ∈ K} ⊂ σ_a(S).
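Membership of λ in the set E of Remark 4.18 can be explored numerically by iterating the maps g_n of Remark 4.14 on the pair (u_{F_0}, u_{F_0}). The sketch below is ours, not part of the paper: the constant choice p_i = p, the iteration depth, and the escape radius are ad-hoc assumptions, and the function name is arbitrary.

```python
# Approximate test of whether (u_{F_n}, w_{F_n}) stays bounded, i.e. lambda in E.

def in_E(lam, a, b, c, d, p, n_iter=60, radius=1e6):
    u = w = (lam - (1 - p)) / p  # u_{F_0} = w_{F_0}
    for _ in range(n_iter):
        # one step of the fibered map g_n with constant parameter p
        u, w = (u**a * w**b - (1 - p)) / p, (u**c * w**d - (1 - p)) / p
        if max(abs(u), abs(w)) > radius:  # treat escape as unboundedness
            return False
    return True

# lambda = 1 gives u_{F_n} = w_{F_n} = 1 for every n, so 1 always lies in E.
print(in_E(1.0, 1, 3, 1, 4, p=0.5), in_E(10.0, 1, 3, 1, 4, p=0.5))  # True False
```

A picture of E can then be produced by evaluating in_E over a grid of complex values of λ.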
Remark 4.19. If e := a + b = c + d, then we have F_n = G_n = e^n, for all n ≥ 0. In this case, the Vershik map is related to the addition of 1 in base e; see Remark 3.3 and Example 3.3. For this class, it was proved in [9] that the point spectrum of S is equal to the fibered filled Julia set of f_n(x) = (x^{a+b} − (1 − p_{n+1}))/p_{n+1}. In the next proposition we prove the same result in our setting.

Proposition 4.20. If a + b = c + d, then σ_pt(S) = E. Furthermore,

E = {λ ∈ ℂ : (f_n ∘ ⋯ ∘ f_1(u_{F_0}))_{n≥1} is bounded}, where f_n(x) = (x^{a+b} − (1 − p_{n+1}))/p_{n+1}, for all n ≥ 1.

Proof. From Theorem 4.13 and Remark 4.14 we have that σ_pt(S) ⊂ E. Let λ ∈ E. Since a + b = c + d, it follows from (4.5) that u_{F_n}(λ) = w_{F_n}(λ), for all λ ∈ ℂ and n ≥ 0. Thus, it follows that |u_{F_n}(λ)| ≤ 1 and |w_{F_n}(λ)| ≤ 1 for all n ≥ 0. Indeed, suppose that there exist k and a real number R > 1 such that |u_{F_k}| = |w_{F_k}| > R. Since abc > 0 and c + d > 1, it follows that min{a + b, c + d} ≥ 2. Thus,

|u_{F_{k+1}}| = |w_{F_{k+1}}| > (R^{a+b} − (1 − p_{k+2}))/p_{k+2} ≥ R^{a+b} > R.

By induction we obtain that |u_{F_{k+i}}| = |w_{F_{k+i}}| > R^i, for all i ≥ 1. Since R > 1, it follows that (u_{F_n})_n and (w_{F_n})_n are unbounded and λ ∉ E, which yields a contradiction. Therefore, if λ ∈ E, then |u_{F_n}(λ)| ≤ 1 and |w_{F_n}(λ)| ≤ 1 for all n ≥ 0 and, by the definition of u_n, we have that |u_n(λ)| ≤ 1 for all n ≥ 1, i.e. λ ∈ σ_pt(S). To prove that E = {λ ∈ ℂ : (f_n ∘ ⋯ ∘ f_1(u_{F_0}))_{n≥1} is bounded}, we just need to observe that f_n ∘ ⋯ ∘ f_1(u_{F_0}(λ)) = u_{F_n}(λ) = w_{F_n}(λ), for all n ≥ 1.

Remark 4.21. In Proposition 4.20 we have proved that if min{|u_{F_i}|, |w_{F_i}|} > 1 for some integer i, then min{|u_{F_n}|, |w_{F_n}|} goes to +∞ when n goes to infinity.

Question. If det M < 0, can we prove that σ_pt(S) = E = σ(S)?

Example 4.22. Consider the consecutive ordering Bratteli diagram represented by the matrix M = (1 3; 1 4). Below, we present some pictures describing the set E = {λ ∈ ℂ : ((u_{F_n}(λ), w_{F_n}(λ)))_{n≥0} is bounded} for some choices of (p_i)_{i≥1}.
Example 4.23. Consider the consecutive ordering Bratteli diagram represented by the matrix M = (1 2; 2 3). Below, we present some pictures describing the set E = F = {λ ∈ ℂ : (u_{F_n}(λ))_{n≥0} is bounded} for some choices of (p_i)_{i≥1}.

Example 4.24. Consider the consecutive ordering Bratteli diagram represented by the matrix M = (1 5; 2 9). Below, we present some pictures describing the set E = F = {λ ∈ ℂ : (u_{F_n}(λ))_{n≥0} is bounded} for some choices of (p_i)_{i≥1}.

Remark 4.25. It would be interesting to compute the different parts of the spectrum of S acting on other Banach spaces like c_0, c, l^q, with q ≥ 1, as done for base 2 in [3] and for Cantor systems of numeration in [3].

4.4. Some topological properties of the set E. Let us suppose for simplicity that p_i = p ∈ ]0, 1[, for all i ≥ 1.

Theorem 4.26. Assume that det M = ad − bc < 0 and bc > (ad − bc)². Then the set E satisfies the following properties:
(1) ℂ \ E is a connected set.
(2) If p < 1/2, then E is not connected.
More informationINTRODUCTION TO MARKOV CHAINS AND MARKOV CHAIN MIXING
INTRODUCTION TO MARKOV CHAINS AND MARKOV CHAIN MIXING ERIC SHANG Abstract. This paper provides an introduction to Markov chains and their basic classifications and interesting properties. After establishing
More informationFUNCTORS AND ADJUNCTIONS. 1. Functors
FUNCTORS AND ADJUNCTIONS Abstract. Graphs, quivers, natural transformations, adjunctions, Galois connections, Galois theory. 1.1. Graph maps. 1. Functors 1.1.1. Quivers. Quivers generalize directed graphs,
More information8. Prime Factorization and Primary Decompositions
70 Andreas Gathmann 8. Prime Factorization and Primary Decompositions 13 When it comes to actual computations, Euclidean domains (or more generally principal ideal domains) are probably the nicest rings
More informationChapter 16 focused on decision making in the face of uncertainty about one future
9 C H A P T E R Markov Chains Chapter 6 focused on decision making in the face of uncertainty about one future event (learning the true state of nature). However, some decisions need to take into account
More informationInterlude: Practice Final
8 POISSON PROCESS 08 Interlude: Practice Final This practice exam covers the material from the chapters 9 through 8. Give yourself 0 minutes to solve the six problems, which you may assume have equal point
More informationDefinitions, Theorems and Exercises. Abstract Algebra Math 332. Ethan D. Bloch
Definitions, Theorems and Exercises Abstract Algebra Math 332 Ethan D. Bloch December 26, 2013 ii Contents 1 Binary Operations 3 1.1 Binary Operations............................... 4 1.2 Isomorphic Binary
More informationMarkov Chains Handout for Stat 110
Markov Chains Handout for Stat 0 Prof. Joe Blitzstein (Harvard Statistics Department) Introduction Markov chains were first introduced in 906 by Andrey Markov, with the goal of showing that the Law of
More information1 Introduction We adopt the terminology of [1]. Let D be a digraph, consisting of a set V (D) of vertices and a set E(D) V (D) V (D) of edges. For a n
HIGHLY ARC-TRANSITIVE DIGRAPHS WITH NO HOMOMORPHISM ONTO Z Aleksander Malnic 1 Dragan Marusic 1 IMFM, Oddelek za matematiko IMFM, Oddelek za matematiko Univerza v Ljubljani Univerza v Ljubljani Jadranska
More informationWeek 9-10: Recurrence Relations and Generating Functions
Week 9-10: Recurrence Relations and Generating Functions April 3, 2017 1 Some number sequences An infinite sequence (or just a sequence for short is an ordered array a 0, a 1, a 2,..., a n,... of countably
More informationMarkov Chains. X(t) is a Markov Process if, for arbitrary times t 1 < t 2 <... < t k < t k+1. If X(t) is discrete-valued. If X(t) is continuous-valued
Markov Chains X(t) is a Markov Process if, for arbitrary times t 1 < t 2
More informationN.G.Bean, D.A.Green and P.G.Taylor. University of Adelaide. Adelaide. Abstract. process of an MMPP/M/1 queue is not a MAP unless the queue is a
WHEN IS A MAP POISSON N.G.Bean, D.A.Green and P.G.Taylor Department of Applied Mathematics University of Adelaide Adelaide 55 Abstract In a recent paper, Olivier and Walrand (994) claimed that the departure
More informationA Z q -Fan theorem. 1 Introduction. Frédéric Meunier December 11, 2006
A Z q -Fan theorem Frédéric Meunier December 11, 2006 Abstract In 1952, Ky Fan proved a combinatorial theorem generalizing the Borsuk-Ulam theorem stating that there is no Z 2-equivariant map from the
More informationChapter 2: Markov Chains and Queues in Discrete Time
Chapter 2: Markov Chains and Queues in Discrete Time L. Breuer University of Kent 1 Definition Let X n with n N 0 denote random variables on a discrete space E. The sequence X = (X n : n N 0 ) is called
More information6 Orthogonal groups. O 2m 1 q. q 2i 1 q 2i. 1 i 1. 1 q 2i 2. O 2m q. q m m 1. 1 q 2i 1 i 1. 1 q 2i. i 1. 2 q 1 q i 1 q i 1. m 1.
6 Orthogonal groups We now turn to the orthogonal groups. These are more difficult, for two related reasons. First, it is not always true that the group of isometries with determinant 1 is equal to its
More informationMATH 326: RINGS AND MODULES STEFAN GILLE
MATH 326: RINGS AND MODULES STEFAN GILLE 1 2 STEFAN GILLE 1. Rings We recall first the definition of a group. 1.1. Definition. Let G be a non empty set. The set G is called a group if there is a map called
More informationare the q-versions of n, n! and . The falling factorial is (x) k = x(x 1)(x 2)... (x k + 1).
Lecture A jacques@ucsd.edu Notation: N, R, Z, F, C naturals, reals, integers, a field, complex numbers. p(n), S n,, b(n), s n, partition numbers, Stirling of the second ind, Bell numbers, Stirling of the
More informationTo appear in Monatsh. Math. WHEN IS THE UNION OF TWO UNIT INTERVALS A SELF-SIMILAR SET SATISFYING THE OPEN SET CONDITION? 1.
To appear in Monatsh. Math. WHEN IS THE UNION OF TWO UNIT INTERVALS A SELF-SIMILAR SET SATISFYING THE OPEN SET CONDITION? DE-JUN FENG, SU HUA, AND YUAN JI Abstract. Let U λ be the union of two unit intervals
More informationChapter 1. Measure Spaces. 1.1 Algebras and σ algebras of sets Notation and preliminaries
Chapter 1 Measure Spaces 1.1 Algebras and σ algebras of sets 1.1.1 Notation and preliminaries We shall denote by X a nonempty set, by P(X) the set of all parts (i.e., subsets) of X, and by the empty set.
More informationMarkov Chains, Stochastic Processes, and Matrix Decompositions
Markov Chains, Stochastic Processes, and Matrix Decompositions 5 May 2014 Outline 1 Markov Chains Outline 1 Markov Chains 2 Introduction Perron-Frobenius Matrix Decompositions and Markov Chains Spectral
More informationEIGENVALUES AND EIGENVECTORS 3
EIGENVALUES AND EIGENVECTORS 3 1. Motivation 1.1. Diagonal matrices. Perhaps the simplest type of linear transformations are those whose matrix is diagonal (in some basis). Consider for example the matrices
More informationNotes on the Matrix-Tree theorem and Cayley s tree enumerator
Notes on the Matrix-Tree theorem and Cayley s tree enumerator 1 Cayley s tree enumerator Recall that the degree of a vertex in a tree (or in any graph) is the number of edges emanating from it We will
More informationonly nite eigenvalues. This is an extension of earlier results from [2]. Then we concentrate on the Riccati equation appearing in H 2 and linear quadr
The discrete algebraic Riccati equation and linear matrix inequality nton. Stoorvogel y Department of Mathematics and Computing Science Eindhoven Univ. of Technology P.O. ox 53, 56 M Eindhoven The Netherlands
More informationMarkov Chains, Random Walks on Graphs, and the Laplacian
Markov Chains, Random Walks on Graphs, and the Laplacian CMPSCI 791BB: Advanced ML Sridhar Mahadevan Random Walks! There is significant interest in the problem of random walks! Markov chain analysis! Computer
More information0 Sets and Induction. Sets
0 Sets and Induction Sets A set is an unordered collection of objects, called elements or members of the set. A set is said to contain its elements. We write a A to denote that a is an element of the set
More informationIntroduction and Preliminaries
Chapter 1 Introduction and Preliminaries This chapter serves two purposes. The first purpose is to prepare the readers for the more systematic development in later chapters of methods of real analysis
More informationAcyclic Digraphs arising from Complete Intersections
Acyclic Digraphs arising from Complete Intersections Walter D. Morris, Jr. George Mason University wmorris@gmu.edu July 8, 2016 Abstract We call a directed acyclic graph a CI-digraph if a certain affine
More informationMS 3011 Exercises. December 11, 2013
MS 3011 Exercises December 11, 2013 The exercises are divided into (A) easy (B) medium and (C) hard. If you are particularly interested I also have some projects at the end which will deepen your understanding
More informationLIMITING CASES OF BOARDMAN S FIVE HALVES THEOREM
Proceedings of the Edinburgh Mathematical Society Submitted Paper Paper 14 June 2011 LIMITING CASES OF BOARDMAN S FIVE HALVES THEOREM MICHAEL C. CRABB AND PEDRO L. Q. PERGHER Institute of Mathematics,
More informationVector bundles in Algebraic Geometry Enrique Arrondo. 1. The notion of vector bundle
Vector bundles in Algebraic Geometry Enrique Arrondo Notes(* prepared for the First Summer School on Complex Geometry (Villarrica, Chile 7-9 December 2010 1 The notion of vector bundle In affine geometry,
More informationDiscrete time Markov chains. Discrete Time Markov Chains, Limiting. Limiting Distribution and Classification. Regular Transition Probability Matrices
Discrete time Markov chains Discrete Time Markov Chains, Limiting Distribution and Classification DTU Informatics 02407 Stochastic Processes 3, September 9 207 Today: Discrete time Markov chains - invariant
More informationAnother algorithm for nonnegative matrices
Linear Algebra and its Applications 365 (2003) 3 12 www.elsevier.com/locate/laa Another algorithm for nonnegative matrices Manfred J. Bauch University of Bayreuth, Institute of Mathematics, D-95440 Bayreuth,
More informationChapter 7. Markov chain background. 7.1 Finite state space
Chapter 7 Markov chain background A stochastic process is a family of random variables {X t } indexed by a varaible t which we will think of as time. Time can be discrete or continuous. We will only consider
More informationMath 455 Some notes on Cardinality and Transfinite Induction
Math 455 Some notes on Cardinality and Transfinite Induction (David Ross, UH-Manoa Dept. of Mathematics) 1 Cardinality Recall the following notions: function, relation, one-to-one, onto, on-to-one correspondence,
More informationMARKOV CHAINS: STATIONARY DISTRIBUTIONS AND FUNCTIONS ON STATE SPACES. Contents
MARKOV CHAINS: STATIONARY DISTRIBUTIONS AND FUNCTIONS ON STATE SPACES JAMES READY Abstract. In this paper, we rst introduce the concepts of Markov Chains and their stationary distributions. We then discuss
More informationELEMENTARY LINEAR ALGEBRA
ELEMENTARY LINEAR ALGEBRA K R MATTHEWS DEPARTMENT OF MATHEMATICS UNIVERSITY OF QUEENSLAND First Printing, 99 Chapter LINEAR EQUATIONS Introduction to linear equations A linear equation in n unknowns x,
More informationA matrix over a field F is a rectangular array of elements from F. The symbol
Chapter MATRICES Matrix arithmetic A matrix over a field F is a rectangular array of elements from F The symbol M m n (F ) denotes the collection of all m n matrices over F Matrices will usually be denoted
More information4.4 Noetherian Rings
4.4 Noetherian Rings Recall that a ring A is Noetherian if it satisfies the following three equivalent conditions: (1) Every nonempty set of ideals of A has a maximal element (the maximal condition); (2)
More information2 G. D. DASKALOPOULOS AND R. A. WENTWORTH general, is not true. Thus, unlike the case of divisors, there are situations where k?1 0 and W k?1 = ;. r;d
ON THE BRILL-NOETHER PROBLEM FOR VECTOR BUNDLES GEORGIOS D. DASKALOPOULOS AND RICHARD A. WENTWORTH Abstract. On an arbitrary compact Riemann surface, necessary and sucient conditions are found for the
More informationNotes on Measure, Probability and Stochastic Processes. João Lopes Dias
Notes on Measure, Probability and Stochastic Processes João Lopes Dias Departamento de Matemática, ISEG, Universidade de Lisboa, Rua do Quelhas 6, 1200-781 Lisboa, Portugal E-mail address: jldias@iseg.ulisboa.pt
More informationClassification of root systems
Classification of root systems September 8, 2017 1 Introduction These notes are an approximate outline of some of the material to be covered on Thursday, April 9; Tuesday, April 14; and Thursday, April
More informationReid 5.2. Describe the irreducible components of V (J) for J = (y 2 x 4, x 2 2x 3 x 2 y + 2xy + y 2 y) in k[x, y, z]. Here k is algebraically closed.
Reid 5.2. Describe the irreducible components of V (J) for J = (y 2 x 4, x 2 2x 3 x 2 y + 2xy + y 2 y) in k[x, y, z]. Here k is algebraically closed. Answer: Note that the first generator factors as (y
More informationLinear Algebra M1 - FIB. Contents: 5. Matrices, systems of linear equations and determinants 6. Vector space 7. Linear maps 8.
Linear Algebra M1 - FIB Contents: 5 Matrices, systems of linear equations and determinants 6 Vector space 7 Linear maps 8 Diagonalization Anna de Mier Montserrat Maureso Dept Matemàtica Aplicada II Translation:
More informationP i [B k ] = lim. n=1 p(n) ii <. n=1. V i :=
2.7. Recurrence and transience Consider a Markov chain {X n : n N 0 } on state space E with transition matrix P. Definition 2.7.1. A state i E is called recurrent if P i [X n = i for infinitely many n]
More informationMarkov Chains (Part 3)
Markov Chains (Part 3) State Classification Markov Chains - State Classification Accessibility State j is accessible from state i if p ij (n) > for some n>=, meaning that starting at state i, there is
More informationFree Subgroups of the Fundamental Group of the Hawaiian Earring
Journal of Algebra 219, 598 605 (1999) Article ID jabr.1999.7912, available online at http://www.idealibrary.com on Free Subgroups of the Fundamental Group of the Hawaiian Earring Katsuya Eda School of
More informationGEOMETRIC CONSTRUCTIONS AND ALGEBRAIC FIELD EXTENSIONS
GEOMETRIC CONSTRUCTIONS AND ALGEBRAIC FIELD EXTENSIONS JENNY WANG Abstract. In this paper, we study field extensions obtained by polynomial rings and maximal ideals in order to determine whether solutions
More informationThe Distribution of Mixing Times in Markov Chains
The Distribution of Mixing Times in Markov Chains Jeffrey J. Hunter School of Computing & Mathematical Sciences, Auckland University of Technology, Auckland, New Zealand December 2010 Abstract The distribution
More information4 Sums of Independent Random Variables
4 Sums of Independent Random Variables Standing Assumptions: Assume throughout this section that (,F,P) is a fixed probability space and that X 1, X 2, X 3,... are independent real-valued random variables
More information1 Directional Derivatives and Differentiability
Wednesday, January 18, 2012 1 Directional Derivatives and Differentiability Let E R N, let f : E R and let x 0 E. Given a direction v R N, let L be the line through x 0 in the direction v, that is, L :=
More informationArithmetic properties of the adjacency matrix of quadriculated disks
Arithmetic properties of the adjacency matrix of quadriculated disks arxiv:math/00762v2 [mathco] 3 Aug 2003 Nicolau C Saldanha and Carlos Tomei December 22, 203 Abstract Let be a bicolored quadriculated
More informationCLASSES OF STRICTLY SINGULAR OPERATORS AND THEIR PRODUCTS
CLASSES OF STRICTLY SINGULAR OPERATORS AND THEIR PRODUCTS GEORGE ANDROULAKIS, PANDELIS DODOS, GLEB SIROTKIN AND VLADIMIR G. TROITSKY Abstract. Milman proved in [18] that the product of two strictly singular
More informationX. Hu, R. Shonkwiler, and M.C. Spruill. School of Mathematics. Georgia Institute of Technology. Atlanta, GA 30332
Approximate Speedup by Independent Identical Processing. Hu, R. Shonkwiler, and M.C. Spruill School of Mathematics Georgia Institute of Technology Atlanta, GA 30332 Running head: Parallel iip Methods Mail
More informationThe Chromatic Number of Ordered Graphs With Constrained Conflict Graphs
The Chromatic Number of Ordered Graphs With Constrained Conflict Graphs Maria Axenovich and Jonathan Rollin and Torsten Ueckerdt September 3, 016 Abstract An ordered graph G is a graph whose vertex set
More informationPartition of Integers into Distinct Summands with Upper Bounds. Partition of Integers into Even Summands. An Example
Partition of Integers into Even Summands We ask for the number of partitions of m Z + into positive even integers The desired number is the coefficient of x m in + x + x 4 + ) + x 4 + x 8 + ) + x 6 + x
More informationTopological K-equivalence of analytic function-germs
Cent. Eur. J. Math. 8(2) 2010 338-345 DOI: 10.2478/s11533-010-0013-8 Central European Journal of Mathematics Topological K-equivalence of analytic function-germs Research Article Sérgio Alvarez 1, Lev
More informationMath 324 Summer 2012 Elementary Number Theory Notes on Mathematical Induction
Math 4 Summer 01 Elementary Number Theory Notes on Mathematical Induction Principle of Mathematical Induction Recall the following axiom for the set of integers. Well-Ordering Axiom for the Integers If
More informationLecture 9 Classification of States
Lecture 9: Classification of States of 27 Course: M32K Intro to Stochastic Processes Term: Fall 204 Instructor: Gordan Zitkovic Lecture 9 Classification of States There will be a lot of definitions and
More informationNotes on the matrix exponential
Notes on the matrix exponential Erik Wahlén erik.wahlen@math.lu.se February 14, 212 1 Introduction The purpose of these notes is to describe how one can compute the matrix exponential e A when A is not
More informationLIMITING PROBABILITY TRANSITION MATRIX OF A CONDENSED FIBONACCI TREE
International Journal of Applied Mathematics Volume 31 No. 18, 41-49 ISSN: 1311-178 (printed version); ISSN: 1314-86 (on-line version) doi: http://dx.doi.org/1.173/ijam.v31i.6 LIMITING PROBABILITY TRANSITION
More informationG METHOD IN ACTION: FROM EXACT SAMPLING TO APPROXIMATE ONE
G METHOD IN ACTION: FROM EXACT SAMPLING TO APPROXIMATE ONE UDREA PÄUN Communicated by Marius Iosifescu The main contribution of this work is the unication, by G method using Markov chains, therefore, a
More informationStatistics 150: Spring 2007
Statistics 150: Spring 2007 April 23, 2008 0-1 1 Limiting Probabilities If the discrete-time Markov chain with transition probabilities p ij is irreducible and positive recurrent; then the limiting probabilities
More information