COMBINATORIAL LÉVY PROCESSES


HARRY CRANE

Abstract. Combinatorial Lévy processes evolve on general state spaces of countable combinatorial structures. In this setting, the usual Lévy process properties of stationary, independent increments are defined in an unconventional way in terms of the symmetric difference operation on sets. In discrete time, the description of combinatorial Lévy processes gives rise to the notion of combinatorial random walk. These processes behave differently than random walks and Lévy processes on other state spaces. Standard examples include processes on subsets of a countable set, graphs with countably many vertices, and n-ary relations, but the framework permits far more general possibilities. The main theorems characterize combinatorial Lévy processes by a unique σ-finite measure. Under the additional assumption of exchangeability, we obtain an explicit Lévy Itô Khintchine-type characterization, by which every exchangeable combinatorial Lévy process corresponds to a Poisson point process on the same state space.

1. Introduction

A Lévy process on R^d is a random path t ↦ X_t with stationary, independent increments and càdlàg sample paths with respect to the Euclidean topology. Lévy processes comprise a large class of tractable models with applications in finance, neuroscience, climate modeling, etc., and the Lévy Itô Khintchine theorem decomposes their rich structure into an independent Brownian motion with drift, a compound Poisson process, and a pure jump martingale. Properties of R-valued Lévy processes specialize those of Lévy processes in general topological groups; see Bertoin [8] for a survey of the real-valued setting. In an arbitrary topological group X, the Lévy process assumptions are defined with respect to the group action, with the left, respectively right, increment between x, x′ ∈ X defined as the unique y ∈ X such that x′ = yx, respectively x′ = xy. Liao [28] gives a general introduction to Lévy processes in topological groups with special treatment of the Lie group case, which garners special interest for its relation to certain types of stochastic flows. In both the real-valued and Lie group settings, many nice properties result from the interplay between the increments assumptions and the topology of the underlying state space. In Euclidean space, the Lévy Itô Khintchine representation is tied to its predecessor, the Lévy Khintchine theorem for infinitely divisible distributions. In a Lie group, the smoothness of the associated Lie algebra plays a key role.

Afield of Lévy processes, combinatorial stochastic processes evolve on discrete state spaces, with a focus on the theory of exchangeable random partitions [5, 9, 26], coalescent and fragmentation processes [7, 9, 27, 3], connections to stable subordinators, Brownian bridges, and Lévy processes [30, 33], and tree- [, 2, 3, 2, 8, 32] and graph-valued [4, 6]

Date: March 8.
Mathematics Subject Classification. 60J25; 60G09; 60B5.
Key words and phrases. combinatorial stochastic process; Lévy process; dynamic networks; Lévy Itô Khintchine representation; exchangeability.

processes. In applications, these processes serve as models for dynamic structures that arise in streaming data collection and complex network applications. Adding a temporal component to the already complicated structural features of combinatorial objects introduces a potentially intractable amount of flexibility that invites arbitrary dynamics and abstruse descriptions. We introduce combinatorial Lévy processes as a reasonable family of models for this purpose.

Combinatorial Lévy processes evolve on discrete spaces of labeled combinatorial objects. Rather than restrict attention to any specific state space, we develop a theory that encompasses special cases as well as more general processes. The following special cases motivate our general treatment.

Set-valued processes: On the space of subsets of N, a combinatorial Lévy process evolves by rearranging elements. For example, each element i = 1, 2, ... might enter and leave the set at alternating times of independent rate-1 Poisson point processes. These dynamics imitate those of some previously studied partition-valued processes [0, 3, 30]. Forty years ago, Harris [2, 22] studied set-valued processes under entirely different assumptions.

Graph-valued processes: Perhaps the most immediate contemporary interest in combinatorial Lévy processes is in modeling dynamic networks. Even given the recent interest in complex networks, stochastic process models for dynamic networks have received little attention: we know of only [23, 25, 34] in the statistics literature, [4, 6] in the probability literature, and [20] and some follow-up articles in the epidemiology and physics literature. Our main discussion explicitly describes the possibilities and limitations of combinatorial Lévy process models for dynamic networks. Our main theorem stratifies the behavior of graph-valued Lévy processes into a hierarchy of global, vertex, and edge-level discontinuities in a parallel manner to the Aldous Hoover decomposition of exchangeable random graphs [4, 24].

Networks with community structure: Another interesting place for this theory is in modeling composite structures, such as dynamic networks with an underlying community structure. In this case, it is natural to combine the above two processes on sets and graphs into a process that models the joint evolution of a network and a community of its vertices. Extensions to collections of k different communities and l different networks and possibly higher-order interactions also fall within the scope of combinatorial Lévy processes.

1.1. Outline. In Section 2, we summarize the main theorems in the case of set-valued Lévy processes. In Section 3, we lay down key definitions, notation, and observations. In Section 4, we formally summarize the main theorems in the language of Section 3. In Section 5, we demonstrate our main theorems with concrete examples that are relevant to specific applications. In Section 6, we prove a key theorem about σ-finite measures on combinatorial spaces, from which we readily deduce the Lévy Itô Khintchine representation for exchangeable combinatorial Lévy processes. In Section 7, we prove our main theorems. In Section 8, we make concluding remarks.

2. Exposition: set-valued processes

Remark 2.1 (Notation). We discuss both discrete and continuous time processes. When speaking generally, we index time by t ∈ [0, ∞). When speaking specifically about discrete time processes, we index time by m ∈ Z_+ := {0, 1, ...} and write X = (X_m, m ≥ 0).
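To make the set-valued example above concrete, here is a small Python sketch (ours, not from the paper) of the first n coordinates of that dynamic: each element toggles its membership at the points of its own rate-1 Poisson process. The function name and output format are illustrative choices.

```python
import random

def alternating_membership_paths(n, t_max, seed=0):
    """Toy simulation (restricted to elements 1..n) of the set-valued example above:
    element i enters and leaves the set at successive points of an independent
    rate-1 Poisson point process, so its membership flips at each such point."""
    rng = random.Random(seed)
    events = []                                   # (time, element): each event flips membership
    for i in range(1, n + 1):
        t = rng.expovariate(1.0)                  # rate-1 exponential inter-arrival times
        while t <= t_max:
            events.append((t, i))
            t += rng.expovariate(1.0)
    events.sort()
    state, path = set(), [(0.0, frozenset())]     # start from the empty set
    for t, i in events:
        state ^= {i}                              # symmetric difference with the singleton {i}
        path.append((t, frozenset(state)))
    return path

for t, s in alternating_membership_paths(n=3, t_max=2.0):
    print(round(t, 3), sorted(s))
```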

We introduce the concept of combinatorial increments to capture structural differences between combinatorial objects. To fix ideas, we first assume X = (X_t, t ≥ 0) evolves on the space of subsets of a base set S ⊆ N, denoted 2^S.

2.1. Increments and topology. Every A ⊆ N determines a map 2^N → 2^N, A′ ↦ A △ A′, where

(1) A △ A′ := (A ∩ A′^c) ∪ (A^c ∩ A′)

is the symmetric difference operation and A^c := N \ A denotes the complement of A. Under this operation, the empty set ∅ := {} acts as the identity and each A ⊆ N is its own inverse, that is, A △ A = ∅ for all A ⊆ N. We equip 2^N with the product discrete topology induced by

(2) d(A, A′) := 1/(1 + sup{n ∈ N : A ∩ [n] = A′ ∩ [n]}), A, A′ ⊆ N,

with the convention 1/∞ = 0. In the following definition, T stands for either discrete time (T = Z_+) or continuous time (T = [0, ∞)). The definition holds in either case, the only difference being that càdlàg paths are automatic in discrete time.

Definition 2.2 (Combinatorial Lévy process on 2^N). We call X = (X_t, t ∈ T) a combinatorial Lévy process on 2^N if it has
X_0 = ∅,
stationary increments, that is, X_{t+s} △ X_s =_D X_t for all s, t ≥ 0, where =_D denotes equality in law,
independent increments, that is, X_{t_1} △ X_{t_0}, ..., X_{t_k} △ X_{t_{k-1}} are independent for all 0 ≤ t_0 ≤ t_1 ≤ ··· ≤ t_k < ∞ in T, and
càdlàg sample paths, that is, t ↦ X_t is right continuous and has left limits under the topology induced by (2).

We can interpret discrete time combinatorial Lévy processes on 2^N as set-valued random walks.

Definition 2.3 (Set-valued random walk). A random walk on 2^N with increment distribution µ on 2^N and initial state X_0 is a discrete time process X = (X_m, m ≥ 0) with X_{m+1} =_D X_m △ ∆_{m+1} for every m ≥ 0, where ∆_1, ∆_2, ... are independent and identically distributed (i.i.d.) according to µ.

Theorem 2.4. Let X = (X_m, m ≥ 0) be a discrete time combinatorial Lévy process on 2^N. Then there exists a unique probability measure µ on 2^N such that X is distributed as a random walk with initial state ∅ and increment distribution µ.

The proof of Theorem 2.4 is straightforward even for general combinatorial Lévy processes, see Theorem 4.4, but we explicitly prove the set-valued case to aid our more general discussion later on.

Proof. The stationary and independent increments assumptions imply that X = (X_m, m ≥ 0) is determined by its initial state X_0 = ∅ and an independent, identically distributed sequence ∆ = (∆_m, m ≥ 1) of subsets, where ∆_m = X_{m−1} △ X_m, m ≥ 1. For each m ≥ 1, ∆_m contains all elements whose status changes between times m − 1 and m; thus, the transition law of X is governed by a unique probability measure µ on 2^N, which acts as the increments measure for the random walk started at ∅.
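The discrete-time picture can be made concrete with a minimal sketch of the restriction to [n] of a set-valued random walk as in Definition 2.3; the function names and the Bernoulli increment law below are illustrative choices, not part of the theory.

```python
import random

def set_valued_random_walk(n, increment_sampler, steps, seed=0):
    """Restriction to [n] of a set-valued random walk as in Definition 2.3:
    X_0 = {} and X_{m+1} = X_m symmetric-difference Delta_{m+1}, with i.i.d. increments."""
    rng = random.Random(seed)
    path = [frozenset()]                          # X_0 is the empty set
    for _ in range(steps):
        delta = increment_sampler(rng)            # Delta_{m+1} restricted to [n]
        path.append(path[-1] ^ delta)             # symmetric difference updates the state
    return path

def bernoulli_increment(n, p):
    """Illustrative increment law: each element of [n] is flipped independently with prob p."""
    return lambda rng: frozenset(i for i in range(1, n + 1) if rng.random() < p)

walk = set_valued_random_walk(n=5, increment_sampler=bernoulli_increment(5, 0.3), steps=4)
for m, state in enumerate(walk):
    print(m, sorted(state))
```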

In continuous time, X = (X_t, t ≥ 0) can experience infinitely many jumps in bounded time intervals, but càdlàg sample paths constrain each induced finite state space process X^[n] := (X_t ∩ [n], t ≥ 0) to jump only finitely often in bounded intervals. These competing notions harness the behavior of X and lead to the following general characterization.

Theorem 2.5. Let X = (X_t, t ≥ 0) be a continuous time combinatorial Lévy process on 2^N. Then there is a unique measure µ on 2^N satisfying

(3) µ({∅}) = 0 and µ({A ∈ 2^N : A ∩ [n] ≠ ∅}) < ∞ for all n ∈ N

such that the infinitesimal jump rates of X satisfy

lim_{t↓0} t^{−1} P{X_t ∈ d∆} = µ(d∆), ∆ ∈ 2^N \ {∅}.

We observe that any combinatorial Lévy process has the Feller property (Corollary 4.7) and, thus, its evolution is determined by the infinitesimal jump rate µ(d∆) = lim_{t↓0} t^{−1} P{X_t ∈ d∆}, ∆ ∈ 2^N \ {∅}. Since ∆ = ∅ corresponds to no jump, we may tacitly assume µ({∅}) = 0. To ensure that each X^[n] jumps only finitely often, µ must also satisfy µ({A ∈ 2^N : A ∩ [n] ≠ ∅}) < ∞ for all n ∈ N. Since the behavior of X is determined by the infinitesimal jump rates lim_{t↓0} t^{−1} P{X_t ∈ d∆}, X can be described by a unique measure µ on 2^N that satisfies (3).

From any µ satisfying (3), we construct the µ-canonical Lévy process X*_µ = (X*_t, t ≥ 0) from a Poisson point process ∆* = {(t, ∆_t)} ⊆ [0, ∞) × 2^N with intensity measure dt ⊗ µ, where dt denotes Lebesgue measure on [0, ∞). The atoms of ∆* determine the jumps of X*_µ and the law of X*_µ coincides with the law of X through the following explicit construction. Given ∆* and n ∈ N, we construct X*^[n]_µ = (X*^[n]_t, t ≥ 0) on 2^[n] by
X*^[n]_0 = ∅,
X*^[n]_t = X*^[n]_{t−} △ (∆_t ∩ [n]), if (t, ∆_t) is an atom of ∆*, and
X*^[n]_t = X*^[n]_{t−} := lim_{s↑t} X*^[n]_s if t > 0 is not an atom time of ∆*,
that is, X*^[n] is constant between the atom times of ∆*. Every combinatorial Lévy process admits a canonical version and the spirit of the Lévy Itô Khintchine theorem lives on; but, rather than the three part decomposition of Lévy Itô Khintchine, we observe a correspondence between combinatorial Lévy processes on 2^N and Poisson point processes with intensity dt ⊗ µ for µ satisfying (3). Theorem 4.5 covers the corresponding description of general combinatorial Lévy processes.

2.2. Exchangeable processes. For processes X = (X_t, t ≥ 0) and X′ = (X′_t, t ≥ 0), we write X =_D X′ to denote that X and X′ have the same finite-dimensional distributions, that is, (X_{t_1}, ..., X_{t_r}) =_D (X′_{t_1}, ..., X′_{t_r}) for all 0 < t_1 < ··· < t_r < ∞. For A ⊆ N and any permutation σ : N → N, we denote the relabeling of A by σ by A^σ, where i ∈ A^σ if and only if σ(i) ∈ A.
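The Poisson point process construction of the µ-canonical process admits a straightforward simulation sketch. The version below is hypothetical and restricted to [n]: it assumes the jumps hitting [n] arrive at a finite total rate (which condition (3) guarantees) and that `jump_sampler` can draw ∆_t ∩ [n] from the corresponding normalized measure; both names are stand-ins, not part of the paper.

```python
import random

def canonical_process_restriction(n, rate, jump_sampler, t_max, seed=0):
    """Sketch of the restriction to [n] of a mu-canonical Levy process on 2^N.
    Assumes the mu-mass of jumps Delta with Delta hitting [n] equals the finite number
    `rate` (cf. condition (3)) and that `jump_sampler` draws Delta ∩ [n] from the
    normalized measure; both are illustrative stand-ins."""
    rng = random.Random(seed)
    t, state = 0.0, frozenset()
    path = [(0.0, state)]                          # piecewise-constant cadlag path
    while True:
        t += rng.expovariate(rate)                 # atom times of the Poisson point process
        if t > t_max:
            break
        delta = frozenset(jump_sampler(rng))       # the jump Delta_t ∩ [n] at this atom
        if delta:
            state = state ^ delta                  # X_t = X_{t-} symmetric-difference (Delta_t ∩ [n])
            path.append((t, state))
    return path

# Example: each atom toggles a single uniformly chosen element of [n] = {1,...,4}.
print(canonical_process_restriction(4, rate=2.0, jump_sampler=lambda r: {r.randint(1, 4)}, t_max=3.0))
```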

We call X exchangeable if X =_D X^σ = (X_t^σ, t ≥ 0) for all permutations σ : N → N that fix all but finitely many elements of N. By Theorem 2.4, the discrete time increments of X = (X_m, m ≥ 0) are independent and identically distributed from a probability measure µ on 2^N. Under the additional assumption that X is exchangeable, µ must also be exchangeable in the sense that

µ({A′ ⊆ N : A′ ∩ [n] = A}) = µ({A′ ⊆ N : A′ ∩ [n] = A^σ}), A ⊆ [n],

for all permutations σ : [n] → [n], for all n ∈ N. Any probability measure ν on [0, 1] induces an exchangeable measure ν* on 2^N by

(4) ν*({A′ ∈ 2^N : A′ ∩ [n] = A}) = ∫_{[0,1]} p^{|A|} (1 − p)^{n−|A|} ν(dp), A ⊆ [n], n ∈ N,

where |A| denotes the cardinality of A ⊆ [n]. de Finetti's theorem [7] gives the converse: every exchangeable probability measure µ corresponds to a unique probability measure ν on [0, 1] so that µ = ν*, that is, µ is the ν-mixture defined in (4). In continuous time, µ in (3) decomposes into mutually singular pieces, invoking a Lévy Itô-type interpretation.

Theorem 2.6. Let X = (X_t, t ≥ 0) be an exchangeable combinatorial Lévy process on 2^N. Then there exists a unique measure ν on [0, 1] satisfying

(5) ν({0}) = 0 and ∫_{[0,1]} s ν(ds) < ∞

and a unique constant c ≥ 0 such that X =_D X*_µ, the µ-canonical Lévy process defined above with

(6) µ = ν* + c Σ_{i=1}^∞ ε_i,

where ν* is defined as in (4), with ν now possibly an infinite measure, and ε_i is the unit mass at {i} ⊆ N for each i ∈ N.

We call (6) the Lévy Itô Khintchine representation. See Theorem 4.4 for the general statement.

2.3. Projecting into [0, 1]. We can project X = (X_m, m ≥ 0) into [0, 1] by X_m ↦ π(X_m), where

(7) π(X_m) := lim_{n→∞} n^{−1} |X_m ∩ [n]|

is the limiting frequency of elements in X_m. By de Finetti's theorem and the law of large numbers, π(X) := (π(X_m), m ≥ 0) exists almost surely whenever X is exchangeable. Furthermore, by independence of X_{m−1} and ∆_m, we observe

π(X_m) =_D π(X_{m−1})(1 − π(∆_m)) + (1 − π(X_{m−1}))π(∆_m),

so that π(X) is also a Markov chain on [0, 1]. In continuous time, the projected process ((π(X_{t−}), π(X_t)), t ≥ 0) exists almost surely and exhibits the Feller property in the Euclidean topology on the 1-dimensional simplex.
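As an illustration of Theorem 2.6, the following sketch simulates the restriction to [n] of an exchangeable set-valued Lévy process with µ = ν* + c Σ_i ε_i, under the simplifying assumption that ν is a finite measure (the theorem also allows ν to be infinite near 0); the function and parameter names are ours.

```python
import random

def exchangeable_set_levy_restriction(n, c, nu_mass, nu_sampler, t_max, seed=0):
    """Sketch of the restriction to [n] of an exchangeable set-valued Levy process with
    characteristic measure mu = nu^* + c * sum_i eps_i (Theorem 2.6), under the
    simplifying assumption that nu is a *finite* measure on (0, 1] with total mass
    nu_mass and normalized sampler nu_sampler."""
    rng = random.Random(seed)
    member = [False] * (n + 1)                  # member[i] is True iff i is in X_t (index 0 unused)
    t, history = 0.0, []
    total_rate = c * n + nu_mass                # singleton flips plus nu-driven global jumps
    while True:
        t += rng.expovariate(total_rate)
        if t > t_max:
            break
        if rng.random() < (c * n) / total_rate:
            i = rng.randint(1, n)               # eps_i part: a single element changes status
            member[i] = not member[i]
        else:
            p = nu_sampler(rng)                 # nu^* part: draw p, flip each element w.p. p
            for i in range(1, n + 1):
                if rng.random() < p:
                    member[i] = not member[i]
        history.append((t, [i for i in range(1, n + 1) if member[i]]))
    return history

# Example: nu = uniform probability measure on [0, 1], c = 0.5.
print(exchangeable_set_levy_restriction(6, c=0.5, nu_mass=1.0, nu_sampler=lambda r: r.random(), t_max=2.0))
```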

2.4. Extending the set-valued case. When moving beyond the set-valued case, the projection operation π : 2^N → [0, 1] must be replaced by the more technically involved notion of a combinatorial limit, which maps an object A to an exchangeable probability measure |A| on the space inhabited by A. In the case where A ⊆ N, we define |A| as follows. For any injection ϕ : [m] → N and A ⊆ N, we define A^ϕ ⊆ [m] by i ∈ A^ϕ if and only if ϕ(i) ∈ A. For any finite subset S ⊆ [m], we define the limiting density of S in A by

δ(S; A) := lim_{n→∞} (n^{↓m})^{−1} Σ_{injections ϕ:[m]→[n]} 1{A^ϕ = S}, if it exists,

where n^{↓m} := n(n − 1) ··· (n − m + 1) and 1{·} is the indicator function of the event described by ·. (As we discuss later, existence of δ(S; A) is guaranteed whenever A is the realization of an exchangeable random set, and so we do not worry about existence for now.) Together the collection (δ(S; A), S ∈ ⋃_{m∈N} 2^[m]) determines a unique, exchangeable probability measure µ on 2^N with

µ({A′ ∈ 2^N : A′ ∩ [m] = S}) = δ(S; A), S ⊆ [m].

We denote this probability measure by |A|. In the set-valued case, |A| and π(A) encode the same probability measure by noting that π(A) = p implies

|A|({A′ ∈ 2^N : A′ ∩ [m] = S}) = p^{|S|}(1 − p)^{m−|S|}, S ⊆ [m].

This equivalence is not obvious, but it follows directly from de Finetti's theorem. There is no such simplification for general structures, and so we must resort to the more technical definition of |A| in terms of the limiting densities δ(S; A), which we introduce formally in Section 3.2.

Our main theorems lift the foregoing ideas for set-valued processes to Lévy processes on countable combinatorial objects, which no longer have the simple 1-dimensional structure of subsets and, thus, require more care. The upshot of our general treatment is an overarching theory for modeling dynamic combinatorial structures. A potential liability of this general framework is that some readers may lose track of the intuition that the main theorems provide in special cases. To avoid these pitfalls, we frame our main theorems in the context of the more tangible cases of set- and graph-valued processes, and we continually revisit these examples throughout.

3. Combinatorial structures

Remark 3.1 (Notation). We employ the usual notation (x_1, ..., x_n) and {x_1, ..., x_n} to denote ordered and unordered sets, respectively.

The above examples are special cases of what we call combinatorial structures.

Definition 3.2 (Combinatorial structures). A signature L is a finite list (i_1, ..., i_k) of nonnegative integers for which 0 ≤ i_1 ≤ ··· ≤ i_k and i_k ≥ 1. Given a signature L = (i_1, ..., i_k) and a set S, a combinatorial structure with signature L over S is a collection M = (S; M_1, ..., M_k) such that M_j ⊆ S^{i_j} for every j = 1, ..., k, with the convention S^0 := {∅} for the S-valued vector of length 0. We alternatively call M an L-structure or simply a structure when the signature L

7 COMBINATORIAL LÉVY PROCESSES 7 is understood. We write L S to denote the set of L-structures over S. We call i j the arity of M j for each j =,..., k. Remark 3.3 (Components with arity 0). We exclude the case i = = i k = 0 from Definition 3.2 for technical reasons. By the convention S 0 = { }, the space L S of structures with signature L = (0) consists of the two elements M = (S; ) and M = (S; { }). Therefore, although the case i = = i k = 0 is not particularly interesting when k =, it is still a nontrivial state space on which to define a process. For k >, the structure M = (S; M,..., M k ) with signature (0,..., 0) corresponds to an element in the hypercube, which is of interest in various applications including the design of experiments. Example 3.4 (Common examples). In terms of Definition 3.2, a subset A N is a combinatorial structure with L = (), that is, A N corresponds to (N; A). A directed graph G with vertex set N and edge set E N N is a structure with L = (2), that is, G = (N; E). (Our definition here permits self-loops in G.) Taking L = (, 2), we obtain M = (N; A, E), which corresponds to a graph (N; E) and a designated subset, or community, of vertices A N. For L = (, 2, 3), M = (N; A, A 2, A 3 ) represents first-, second-, and third-order interactions among a collection of particles or among statistical units in a designed experiment. The act of subsampling S S induces a natural restriction operation L S L S by M M S, where (8) M S := (S ; M S i,..., M k S i k ). Any permutation σ : S S induces a relabeling operation L S L S by M M σ, where (9) M σ := (S; M σ,..., Mσ k ) is defined by (a,..., a ij ) M σ j if and only if (σ(a ),..., σ(a ij )) M j for each j =,..., k. Combining (8) and (9), we define the image of M L S by any injection ϕ : S S as M ϕ = (S ; M ϕ,..., Mϕ k ) L S, where (0) (a,..., a ij ) M ϕ j if and only if (ϕ(a ),..., ϕ(a ij )) M j for each j =,..., k. Under these operations, the space L N of countable combinatorial structures comes furnished with the product discrete topology induced by the ultrametric () d(m, M ) := /( + sup{n N : M [n] = M [n] }), M, M L N, with the convention / = 0. Under (), (L N, d) is a compact, separable, and Polish metric space, which we equip with the Borel σ-algebra. 3.. Combinatorial increments. For any S N and M = (S; M,..., M k ) L S, we write {, a Mj, M j (a) = {a M j } := 0, otherwise, for each a = (a,..., a ij ) S i j, j =,..., k. We then define the increment between M and M in L S by M M = (M, M ) := (S;,..., k ), where (2) a j if and only if M j (a) M j (a), for each a = (a,..., a ij ) S i j, j =,..., k. For example, when L = (), M M is the symmetric difference between subsets of N as in (); when L = (2), M M is the directed
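A short code sketch may help fix the bookkeeping behind the restriction (8) and the image of a structure under an injection. It encodes a finite L-structure as a base set together with one set of tuples per component; the encoding and function names are our illustrative choices, not the paper's.

```python
from itertools import product

def restrict(M, S_prime):
    """Restriction M|_{S'} as in (8): keep only the tuples whose entries all lie in S'."""
    base, rels = M
    Sp = set(S_prime)
    return (Sp, [{a for a in R if set(a) <= Sp} for R in rels])

def image(M, signature, phi, S_prime):
    """Image of M under an injection phi : S' -> base (the combined relabeling and
    subsampling map): a tuple a lies in component j of the image iff its phi-image lies in M_j."""
    base, rels = M
    out = []
    for arity, R in zip(signature, rels):
        out.append({a for a in product(S_prime, repeat=arity)
                    if tuple(phi[x] for x in a) in R})
    return (set(S_prime), out)

# Signature L = (1, 2): a subset of the base set together with a directed graph on it.
L = (1, 2)
M = ({1, 2, 3}, [{(1,), (3,)}, {(1, 2), (2, 3), (3, 1)}])
print(restrict(M, {1, 2}))                   # drops element 3 and every edge touching it
print(image(M, L, {1: 2, 2: 3}, [1, 2]))     # pulls M back through the injection 1 -> 2, 2 -> 3
```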

8 8 HARRY CRANE graph whose edges are the pairs (i, j) at which M and M differ; and so on. Importantly, the increment between any two L-structures is also an L-structure with the same base set. The spaces of L-structures we consider can be regarded as a group (L N, ), where the group action is defined by the increment operation above. In particular, every M L S acts on L S by M M M. Defined in this way, (L S, ) is a transitive, abelian group with identity given by the empty structure 0 L S := (S;,..., ) and for which every element M L S is its own inverse. The group structure of L S enriches the product discrete topology induced by () and underlies several key properties of combinatorial Lévy processes. Furthermore, L S is partially ordered and has minimum element 0 L under pointwise inclusion, that is, S M M if and only if M j (a) M j (a) for every a Si j, for all j =,..., k Exchangeability and combinatorial limits. de Finetti s theorem, the Aldous Hoover theorem, and their relatives permit the study of exchangeable sequences, graphs, and {0, }- valued arrays by projecting into a continuous limit space, for example, the unit interval, the space of graph limits, and the space of hypergraph limits, respectively. More generally, exchangeable combinatorial L-structures permit an analogous representation in a space of combinatorial limits. The example in Section 2 shows that much of the structural behavior of an exchangeable set-valued Lévy process is determined by its projection into the unit interval. Our main theorems extend this idea to characterize exchangeable combinatorial Lévy processes through their induced behavior in the appropriate limit space. As mentioned in Section 2, the combinatorial limit of M L N is not as simple as the projection of A N to its limiting frequency π(a) as in (7). To see why, consider A, A N and let M = (N; A, A ) be the associated (, )-structure. Although M is just a pair of subsets, the individual frequencies π(a) and π(a ) are not sufficient to summarize the full structure of M: if we construct A by including each element i N independently with probability p (0, ) and we define A = A, then π(a) = π(a ) = p with probability ; but if we define A and A as independent and identically distributed so that each element has probability p (0, ) of appearing in A, respectively A, then π(a) = π(a ) = p with probability, but P{A = A } = 0. In both cases, (π(a), π(a )) = (p, p), but the structure of M = (N; A, A ) is vastly different in the two constructions. The pair (π(a), π(a )) does not capture all structural features of (N; A, A ), motivating the following definition. Definition 3.5 (Homomorphism density). For any signature L and finite subsets S S N, we define the homomorphism density of A L S in M L S by (3) δ(a; M) := {M ϕ = A}, S S ϕ:s S where the sum is over injections ϕ : S S, S denotes the cardinality of S N, and n m := n(n ) (n m + ) is the falling factorial function. For brevity, we refer to (3) as the density of A in M. Intuitively, δ(a; M) is the probability that M ϕ = A for ϕ chosen uniformly at random among all injections S S. For fixed M L S, the density function δ( ; M) determines a probability measure on L S for every S S. For M L N and A L [m], we define the limiting density of A in M by (4) δ(a; M) := lim δ(a; M [n] ), if it exists. n Provided each of the limits δ(a; M), A L [n], exists, the collection of homomorphism densities (δ(a; M), A L [n] ) determines a probability measure on L [n] by the bounded

9 COMBINATORIAL LÉVY PROCESSES 9 convergence theorem. If (4) exists for every A n N L [n], then the family of distributions defined by (δ(a; M), A L [n] ) for each n N determines a unique probability measure on L N, which we denote by M. Definition 3.6 (Combinatorial limit). We define the combinatorial limit M of M L N as the unique probability measure µ on L N such that (5) µ({a L N : A [m] = A}) = δ(a; M), A L [m], m N, provided the limit δ(a; M) exists for every A m N L [m]. For brevity, we write M (A) := M ({A L N : A [m] = A}) for each A L [m], m N. Lovász and Szegedy [29] defined the concept of a graph limit in terms of the limiting homomorphism densities of all finite subgraphs within a sequence of graphs. Definition 3.6 extends the Lovász Szegedy notion to the more general setting of combinatorial structures from Definition 3.2. The space of exchangeable, dissociated probability measures plays a fundamental role in the study of exchangeable structures. Definition 3.7 (Exchangeable and dissociated L-structures). For any S N, a random structure M = (S; M,..., M k ) is exchangeable if M σ = D M for all permutations σ : S S that fix all but finitely many elements of S. We call M L S dissociated if M T and M T are independent whenever T, T S are disjoint. When A N is a random subset, exchangeable and dissociated corresponds to independent and identically distributed, which explains why the projection π(a) into [0, ] in (7) is enough to determine the combinatorial limit M of a ()-structure M = (N; A); see Equation (7) and the discussion at the end of Section 2. For more complex structures, dissociation still allows dependence between certain parts of the structure. In Proposition 6.2, we prove that the combinatorial limit of any exchangeable L-structure exists with probability. Definition 3.8 (Combinatorial limit space). For any signature L, we write E L to denote the space of exchangeable, dissociated probability measures on L N. As every W E L is a probability measure on L N, we write W(A), A L [n], as shorthand for W(A) := W({M L N : M [n] = A}), A L [n]. We then define the distance between W, W E L by (6) d(w, W ) = 2 n W(A) W (A). n N A L [n] We equip E L with the Borel σ-algebra induced by this metric. The Borel σ-algebra is the smallest σ-algebra such that : L N E L { } is measurable, where we define M = whenever M does not exist. 4. Summary of main theorems 4.. General combinatorial Lévy processes. Recall the definition of the increment : L S L S L S in (2) and 0 L = (S;,..., ). S Definition 4. (Combinatorial Lévy process). For any signature L and S N, we call X = (X t, t 0) on L S a combinatorial Lévy process if it has X 0 = 0 L S,

10 0 HARRY CRANE stationary increments, that is, (X t+s, X s ) = D X t for all s, t 0, independent increments, that is, (X t, X t0 ),..., (X tk, X tk ) are independent for all 0 t 0 t t k <, and càdlàg sample paths, that is, t X t is right continuous and has left limits under the product discrete topology induced by (). Remark 4.2. The first condition above, X 0 = 0 L S, is akin to the condition X 0 = 0 for R-valued Lévy processes. By the stationarity and independence of increments, there is no loss of generality in assuming X 0 = 0 L S. A combinatorial Lévy process Xx with initial state X 0 = x can be obtained from X = (X t, t 0) started at 0 L S by putting Xx := (X x t, t 0) with Xx t = X t x for all t 0. In discrete time, combinatorial Lévy processes are analogous to random walks, and most of their structural properties follow directly from Definition 4.. Definition 4.3 (Combinatorial random walk). A (combinatorial) random walk on L S with increment distribution µ and initial state X 0 is a discrete time process X = (X m, m 0) with (7) X m = D (X m, m ), m, where, 2,... are i.i.d. from µ. Theorem 4.4. Let X = (X m, m 0) be a discrete time combinatorial Lévy process on L S. Then there exists a unique probability measure µ on L S such that X = D X µ = (X m, m 0), where X µ is a combinatorial random walk on L S with initial state X 0 = 0 L S and increment distribution µ. In continuous time, a combinatorial Lévy process on L N must balance its behavior so that its sample paths satisfy the càdlàg requirement: since each L [n] is a finite state space, X [n] := (X t [n], t 0) can jump only finitely often in bounded time intervals. On the other hand, since we have ruled out the case i = = i k = 0, X = (X t, t 0) evolves on an uncountable state space and is defined at an uncountable set of times; therefore, X can experience infinitely many discontinuities in any bounded time interval. Condition (8) in Theorem 4.5 strikes the balance. Theorem 4.5. Let X = (X t, t 0) be a continuous time combinatorial Lévy process on L N. Then there is a unique measure µ on L N satisfying (8) µ({0 L N }) = 0 and µ({m L N : M [n] 0 L }) < for all n N [n] such that the infinitesimal jump rates of X satisfy (9) lim t 0 t P{X t d } = µ(d ), L N \{0 L N }, where convergence in (9) is understood in the sense of vague convergence of σ-finite measures. The limit in (9) is well defined on account of the Feller property for combinatorial Lévy processes, as we now discuss. The stationary and independent increments assumptions imply that X is a time homogeneous Markov process with transition law determined by the Markov semigroup Q = (Q t, t 0), where (20) Q t g(m) := Eg(X t M), t 0, for all bounded, continuous functions g : L N R and all M L N. We call Q a Feller semigroup and say that X has the Feller property if lim t 0 Q t g(m) = g(m) for all M L N and

11 M Q t g(m) is continuous for every t > 0, for all bounded, continuous g : L N R. COMBINATORIAL LÉVY PROCESSES Proposition 4.6. An L N -valued process X = (X t, t 0) is a combinatorial Lévy process if and only if X [n] = (X t [n], t 0) is a combinatorial Lévy process on L [n] for every n =, 2,.... From Proposition 4.6, we deduce the Feller property for combinatorial Lévy processes. Corollary 4.7. Every combinatorial Lévy process has the Feller property. Definition 4.8 (σ-finite measures). A measure µ on L N is σ-finite if it satisfies (8). Given a σ-finite measure µ, we construct X µ = (Xt, t 0) from a Poisson point process = {(t, t )} [0, ) L N with intensity measure dt µ. For each n N, we construct Xµ [n] = (X [n] t, t 0) on L [n] by putting X [n] = 0 L 0 [n] and X [n] t X [n] t where X [n] t = X [n] t t [n], if (t, t ) and t [n] 0 L [n], and = X [n] t otherwise, := lim s t X [n] s is the state of X [n] µ at the instant before time t. (Notice that by the last condition we tacitly construct X [n] µ to be constant between atom times of.) Since we construct each X [n] µ from the same Poisson point process, the collection (X [n], n N) is mutually compatible, that is, X [n] µ [m] := (X [n] t [m], t 0) = X [m] µ for every m n, and, thus, determines a unique process X µ = (Xt, t 0) on L N. Definition 4.9 (Canonical Lévy processes). We call X µ a µ-canonical Lévy process. Theorem 4.0. Let X be a combinatorial Lévy process with rate measure µ as in (8). Then X = D X µ, where X µ is a µ-canonical Lévy process. Conversely, every combinatorial Lévy process X has the same finite-dimensional distributions as some canonical Lévy process corresponding to a σ-finite measure µ Exchangeable processes. Definition 4. (Exchangeable Lévy process). An L S -valued process X = (X t, t 0) is exchangeable if X = D X σ for all permutations σ : S S that fix all but finitely many elements of S. The special case of graph-valued Lévy processes relates to recent work on the theory of graph limits [29] and dynamic random networks [6]. Definition 3.6 extends the notion of graph limit to that of a combinatorial limit for general L-structures. By projecting into the appropriate combinatorial limit space, the preceding theorems specialize nicely to the exchangeable setting. Recall that the limit space E L consists of exchangeable, dissociated probability measures on L N. Given a measure ν on E L, we write ν to denote the exchangeable measure it induces on L N by (2) ν (S) := W(S)ν(dW), S L N. E L As long as ν is a probability measure on E L, ν is a probability measure on L N, but the definition in (2) is well defined for arbitrary positive measures ν. For any combinatorial Lévy process X = (X t, t 0), we write X = ( X t, t 0) to denote its projection into E L, if it exists. The next theorem says that X always exists for exchangeable combinatorial Lévy processes.

12 2 HARRY CRANE Theorem 4.2. Let X = (X m, m 0) be an exchangeable combinatorial Lévy process in discrete time. Then there exists a unique probability measure ν on E L such that the increments of X are independent and identically distributed according to ν. Moreover, the projection X = ( X m, m 0) exists almost surely and is a Markov chain Lévy Itô structure. The final theorems explicitly characterize the measure µ guaranteed by Theorem 4.0 for continuous time processes. A formal statement requires more notation. We often deal with unordered multisets, for which we also write {x,..., x n } with the understanding that x,..., x n need not be distinct. We can also express any multiset x = {x,..., x n } by {i m i : i }, where m i = {j [n] : x j = i} is the multiplicity of element i in x. For example, x = {,, 2, 2, 2} has m = 2, m 2 = 3, and m j = 0 for j 3 so that x { 2, 2 3 }, omitting elements with multiplicity 0 for convenience. Given two multisets x, x with multiplicities m = (m, m 2,...) and m = (m, m,...), respectively, we write x x 2 to denote that m i m i for all i and we define the intersection x x to be the multiset with multiplicities m i m i for each i. When necessary, we write [x] := {j : m j > 0} to denote the set of elements in x without multiplicity. We apply the same notation for ordered multisets x = (x,..., x n ) when the order of elements is inconsequential, as in the conditions of (22) below. Let L = (i,..., i k ) be a signature and s = {s,..., s q } N be a multiset, for some q = 0,,..., i k. For any M L N, we define an L-structure M s = (N; M s,,..., M s,k ) by M (22) M s,j (a) = j (a), a s, a s, [a] = [s], M j (a), a > s, s a, [a] [s], a N i j, j =,..., k. 0, otherwise, Therefore, M s is the L-structure that corresponds to M on supersets of s and to 0 L N otherwise. We call M s the s-substructure of M. Remark 4.3. The two separate conditions in (22) are needed to fully capture all possible behaviors in our main theorem below. The subset s = {s,..., s q } in (22) represents the elements indexing the chosen substructure of M. If s i j for some component j =,..., k of the signature L = (i,..., i k ), then M (a) is nonzero only if a is a proper subset of s in the sense that the multiplicities of a are no s,j greater than s and all elements in a are also in s. If s < i j, then M s is nonzero only if all elements of s appear in a with multiplicity at least their multiplicity in s. Some examples should clarify this definition. Let L = (, 2) so that M = (N; A, E) is a set A N together with a graph (N; E). For s = {}, M s retains only relations in M involving element. Specifically, M (a) = 0 for all s,j tuples a except possibly those containing element : { M s, ((j)) = M (()), j =, and 0, otherwise, { M s,2 ((j, M2 ((j, j j )) = )), j = or j =, 0, otherwise. With regard to (22), we have s = so that M is determined by the top line of (22), with s, the only nontrivial contribution from a = (), and M is determined by the second line s,2 of (22), with nontrivial contributions from all a such that [a] {}. We note the difference

13 COMBINATORIAL LÉVY PROCESSES 3 when s = {, }, which has [s] = {} but should not be confused with the singleton {} in the context of (22). In this case, { M s, ((j)) = M (()), j =, and 0, otherwise, { M s,2 ((j, M2 ((, )), (j, j j )) = ) = (, ), 0, otherwise. Once again, M is determined by the top line of (22), for which the only nontrivial s, contribution must have [a] = {} and, therefore, a = (). In contrast to the case s = {}, however, the top line of (22) also applies to M, since we now have a s. The contribution s,2 M s,2 (a) is nontrivial only if [a] = [s] = {}, that is, a = (, ). Therefore, M {} and M {,} are different structures in general. As a special case, we point out that M = M for all M L N. Any multiset s = {s,..., s q } N determines a partition of the integer q, written λ(s) = λ 2 λ 2 q λ q, where λ j := {r s : {i [q] : s i = r} = j}, j =,..., q, is the number of elements that appear with multiplicity j in s. In general we write λ q to indicate that λ = λ 2 λ 2 q λ q is a partition of the integer q, which must satisfy λ i 0 for all i =,..., q and q i= iλ i = q. For j =,..., k and s = {s,..., s q }, q = 0,,..., i j, we can express each component M s,j of M s as a structure with signature (i j q) k j = (i j q,..., i j q) with k j equal arities, where ( ) ij q! k j := i j q q l= l!λ l for λ(s) = λ 2 λ 2 q λ q. (Note that k j is the number of all possible ways to insert the elements of s in an i j -tuple in any possible order.) For example, consider the case of i j = 3 and s = {, 2}, so that q = 2, λ(s) = 2 2 0, and k j = 6 corresponds to the six tuples of the form (,, 2), (, 2, ), (,, 2), (2,, ), (, 2, ), (2,, ), where entries can be filled with arbitrary indices. In this case, we express M s,j = (N; M s,j,,..., M s,j,6 ), where each M N ij q = N. With the indices l =,..., 6 corresponding to the ordering of tuples above, we have, for s,j,l example, M s,j, ((a)) = M j((a,, 2)), M s,j,2 ((a)) = M j((a, 2, )), M s,j,3 ((a)) = M j((, a, 2)), and so on. For s N with λ(s) = λ, we write L λ to denote the signature of M s. For every s N, we define s by (23) M s = ( M s,,..., M s,k ), where M s,j is the combinatorial limit of M s,j as an (i j q) k j -structure, with any prespecified convention for ordering the components of M s,j = (M s,,..., M s,k j ). We write M s = 0 if and only if M s,j = 0 (i j q) k j for all j =,..., k, where recall 0 L is the combinatorial limit of the empty structure 0 L with signature L. N
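The case analysis in (22) is easy to lose track of, so the next sketch computes the s-substructure of a small structure and reproduces the worked examples above (s = {1} versus s = {1, 1}); the precise reading of (22) encoded here is stated in the docstring and should be treated as an assumption rather than a definitive transcription.

```python
from collections import Counter
from itertools import product

def substructure(M, signature, base, s):
    """Sketch of the s-substructure M_s of (22) for a structure over a finite base set.
    Assumed reading of (22), matched against the worked examples above: keep M_j(a) when
    either |a| <= |s| with a contained in s as multisets and [a] = [s], or |a| > |s| with
    s contained in a as multisets; all other entries are set to 0."""
    cs, set_s = Counter(s), set(s)
    out = []
    for arity, R in zip(signature, M):
        kept = set()
        for a in product(base, repeat=arity):
            ca = Counter(a)
            small = arity <= len(s) and not (ca - cs) and set(a) == set_s
            large = arity > len(s) and not (cs - ca)
            if (small or large) and a in R:
                kept.add(a)
        out.append(kept)
    return out

# Worked examples from the text, signature L = (1, 2) over the base set {1, 2, 3}:
L = (1, 2)
M = [{(1,), (2,)}, {(1, 1), (1, 2), (2, 3)}]
print(substructure(M, L, [1, 2, 3], (1,)))      # s = {1}:   keeps (1,) and every edge touching 1
print(substructure(M, L, [1, 2, 3], (1, 1)))    # s = {1,1}: keeps (1,) and only the self-loop (1,1)
```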

14 4 HARRY CRANE For any λ = λ q λ q q, we define π λ = (π,..., π p ) by π π 2 π p > 0 such that π + + π p = q and {k N : {i [q] : π i = k} = j} = λ j for each j =,..., q. We define the canonical λ-multiset by s λ = { π, 2 π 2,..., p π p } so that each i appears π i times in s λ. For example, if λ = then π λ = (4, 2,, ) and s λ = {,,,, 2, 2, 3, 4}. For any s N with λ(s) = λ, we index s = {s π,..., sπ p p } so that s i < s i+ whenever π i = π i+ and we define the canonical mapping σ s,λ : [p] s by σ s,λ (i) = s i for each i =,..., p. For example, let s = {,, 3, 4, 4, 5, 5, 5, 5} so that λ(s) = , s λ = {,,,, 2, 2, 3, 3, 4}, and π λ = (4, 2, 2, ). Then we write s = {5 4, 2, 4 2, 3 } and σ s,λ : [4] s assigns σ s,λ () = 5, σ s,λ (2) =, σ s,λ (3) = 4, and σ s,λ (4) = 3, so that s σ s,λ = s λ. The above preparation anticipates Theorem 4.4 in which we decompose exchangeable σ-finite measures on L N according to how they handle various substructures. Below we write µ λ to denote a measure on L N that satisfies (8), (24) is invariant with respect to permutations that fix s λ, (25) M s λ = M for µ λ -almost every M L N, and (26) {s : M s 0} = s λ for µ λ -almost every M L N, where the intersection of multisets is defined at the beginning of Section 4.3. We then define (27) µ λ ( ) = µ λ ({M L N : M σ s,λ }). s N:λ(s)=λ For example, let λ = be the only partition of integer and { c, i =, µ λ ((N; {i})) = 0, otherwise, for some c > 0. Then µ λ satisfies (8), (24), (25), and (26). For any k >, we note that s = {k} has λ =, π λ = (), and σ s,λ () = k so that M σ s,λ = (N; {}) if and only if M = (N; {k}). In this case, µ λ assigns mass c to each singleton subset (N; {k}), k N, so µ is exchangeable λ and satisfies (8). Compare the definition of µ to that of c i= ɛ i in Theorem 2.6. Theorem 4.4 (Lévy Itô Khintchine representation for combinatorial Lévy processes). Let L = (i,..., i k ) be any signature and X = (X t, t 0) be an exchangeable combinatorial Lévy process on L N. Then there exists a unique measure ν 0 on E L satisfying (28) ν 0 ({0 L }) = 0 and ( W({0 L [i k ] }))ν 0(dW) <, E L and measures µ λ on L N satisfying (8), (24), (25), and (26) such that (29) µ = ν 0 + µ λ, q=,...,i k λ q where λ q denotes that λ is a partition of q and µ is defined in (27). λ
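To keep the notation of this subsection straight, the following sketch computes λ(s), π_λ, the canonical λ-multiset s_λ, and the canonical mapping σ_{s,λ}, and reproduces the worked example above; the tie-breaking rule among equal multiplicities follows the indexing convention s_i < s_{i+1} whenever π_i = π_{i+1}.

```python
from collections import Counter

def canonical_data(s):
    """For a multiset s given as a list, compute the integer partition lambda(s),
    the decreasing multiplicity vector pi_lambda, the canonical lambda-multiset s_lambda,
    and the canonical mapping sigma_{s,lambda} described in Section 4.3."""
    counts = Counter(s)
    q = len(s)
    lam = [sum(1 for m in counts.values() if m == j) for j in range(1, q + 1)]   # lambda_j
    ordered = sorted(counts, key=lambda r: (-counts[r], r))   # decreasing multiplicity, then value
    pi = [counts[r] for r in ordered]                         # pi_1 >= pi_2 >= ... > 0
    s_lambda = [i + 1 for i, p in enumerate(pi) for _ in range(p)]
    sigma = {i + 1: r for i, r in enumerate(ordered)}         # sigma_{s,lambda} : [p] -> s
    return lam, pi, s_lambda, sigma

# Worked example from the text: s = {1, 1, 3, 4, 4, 5, 5, 5, 5}.
lam, pi, s_lambda, sigma = canonical_data([1, 1, 3, 4, 4, 5, 5, 5, 5])
print(pi)        # [4, 2, 2, 1]
print(s_lambda)  # [1, 1, 1, 1, 2, 2, 3, 3, 4]
print(sigma)     # {1: 5, 2: 1, 3: 4, 4: 3}
```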

15 COMBINATORIAL LÉVY PROCESSES 5 We call (29) the Lévy Itô Khintchine representation for exchangeable combinatorial Lévy processes. In a precise sense, see Theorem 4.5, ν describes the discrete component of X 0 and the µ λ decompose the continuous component of X. Theorem 4.5. Let X = (X t, t 0) be an exchangeable Lévy process on L N. Then the projection X = ( X t, t 0) into E L exists almost surely and is a Feller process. Moreover, the sample paths of X are continuous except at the times of jumps from the ν measure in (29). 0 Much of our remaining effort is dedicated to proving Theorems 4.4 and 4.5. We organize the next few sections as follows. We first illustrate the above theorems in specific, concrete cases. We then discuss combinatorial limits and prove a precursor to Theorem 4.4 before deducing the main theorems. 5. Examples We couch the above theorems in terms of some specific combinatorial Lévy processes, beginning with a summary of the set-valued Lévy processes from Section 2 and then moving on to graph-valued processes. Finally, we combine set- and graph-valued processes to demonstrate how higher order structures evolve according to (29). 5.. Set-valued Lévy processes. In Section 2, we discussed combinatorial Lévy processes in the special case when L = () and X = (X t, t 0) evolves on the space of subsets of N. In this case, the combinatorial limit of (N; A) is determined by the limiting frequency of elements in a subset A N, π(a) = lim n n A [n]. de Finetti s theorem implies that the marginal distribution of X at any fixed time t 0 is determined by a unique probability measure ν on [0, ] as in (4). In the context of Theorem 4.4, the behavior of X on L N is described by a measure µ = ν + c i= ɛ i with components defined as in Theorem 2.6. The first component ν is induced from a measure ν satisfying (5), the analog to (28) in the special case of set-valued processes. The second component c i= ɛ i plays the role of µ in (29) since λ = is the only partition of the integer. The only nontrivial measures on the (0)-structure that satisfy (8), (24), (25), and (26) must be of the form µ ({ }) = 0 and µ ({}) = c 0. Our definition of µ λ in (27) gives µ ( ) = c i= ɛ i( ). The contribution of µ to the characteristic measure of X is as discussed previously: each i N changes status independently at rate c 0, while the rest of X remains constant Graph-valued Lévy processes. Let X = (X t, t 0) be a Lévy process on the space of directed graphs, possibly with self-loops, so that X evolves on the space of L-structures with L = (2). By Theorem 4.4, the first component of µ in (29) is a measure ν 0 on the space of graph limits satisfying (28). The second component is decomposed according to the three partitions, 2 2 0, and 0 2 of the integers and 2 as follows. ( ) µ is a measure on L N for which almost every M = (N, E) has M = M and at least {} one of the conditions n n lim n n {(, j) E} > 0 or lim n {(j, ) E} > 0 n holds. j= j=

16 6 HARRY CRANE ( ) µ 2 20 assigns 0 mass to all M = (N; E) except that for which at least one of (, 2) E and (2, ) E holds and (i, j) E otherwise. ( 0 2 ) µ 0 2 assigns 0 mass to all M = (N; E) except that for which (, ) E and (i, j) E otherwise. The jump rates of X are determined by µ = ν + µ + µ + µ discontinuity in X, either At the time of a (0) a strictly positive proportion of edges changes status according to a σ-finite measure ν on countable graphs, 0 ( ) a positive proportion of edges incident to a specific vertex changes status and other edges stay fixed, ( ) edges involving a specific pair {i, j}, i j, change status and the rest of the graph stays fixed, or ( 0 2 ) a single self-loop (i, i) changes status for a specific i N and the rest of the graph stays fixed. In this special case, the limit process X = ( X t, t 0) evolves on the space of graph limits. Lovász and Szegedy [29] introduced the term graph limit in 2006, but a more general concept originates with the Aldous Hoover theorem in the late 970s; see [5, Theorem 4.] Networks with a distinguished community. Combining the structures in the previous two sections, we get signature L = (, 2), which corresponds to a structure M = (N; A, E) with A N and E N N. In this case, a combinatorial Lévy process X = (X t, t 0) offers the interpretation as the evolution of a network along with a distinguished community of its vertices. As in the previous section, we must consider partitions of integers and 2, so Theorem 4.4 characterizes exchangeable processes X by a σ-finite measure ν 0 on E L and measures µ, µ 2 2 0, µ 0 2. The ν 0 measure governs a joint evolution of the community and the network such that atoms from ν 0 cause a positive proportion of elements to change community status and/or a positive proportion of edges to change status. The µ λ measures play a similar role to Section 5.2 with some modifications. For, µ allows for the status of element to change in the subset A as well as a change to a positive proportion of edges incident to element as in Section 5.2. For 2 2 0, µ 2 20 is just as in Section 5.2: there is a change to at least one of the edges (, 2) and (2, ) and no change in the community structure A. For 0 2, µ 0 2 allows for a change to the status of element in the community structure as well as a change to the status of edge (, ) in E. 6. Characterization of exchangeable σ-finite measures 6.. Limits of combinatorial structures. Recall definition (4) of the limiting densities of a structure M. Theorem 6. (Aldous Hoover theorem for L-structures [4, 24]). Let L = (i,..., i k ) be a signature and M be an exchangeable L-structure over N. Then there exists a measurable function g = (g,..., g k ) with g j : [0, ] 2i j {0, } for each j =,..., k such that M = D M g = (N; M g,..., Mg ), where k M g j (a) = g j((ξ s ) s a ), a = (a,..., a ij ) N i j,

17 COMBINATORIAL LÉVY PROCESSES 7 for (ξ s ) s N: s ik a collection of i.i.d. Uniform[0, ] random variables. In particular, M is conditionally dissociated given its tail σ-field. Proposition 6.2. Let M = (N; M,..., M k ) be an exchangeable L-structure. Then δ(a; M) exists almost surely for every A L [m], for all m =, 2,.... Moreover, the collection (δ(a; M), A m N L [m] ) exists almost surely and determines a unique probability measure M on L N. Proof. In addition to being exchangeable, we first assume that M is dissociated, that is, M S and M T are independent whenever S and T are disjoint. For a fixed L-structure A = ([m]; A,..., A k ) over [m], we define Z n := n m ϕ:[m] [n] {M ϕ [n] = A}, for each n =, 2,.... Under uniform selection of an injection ϕ : [m] [n], the σ-field F n := σ Z n+, Z n+2,... induces P{M ϕ [n] = A F n} = Z n+, for each n =, 2,.... Thus, E(Z n F n ) = E n m ϕ:[m] [n] {M ϕ [n] = A} F n = Z n+ and (Z n, n N) is a reverse martingale. By the reverse martingale convergence theorem, there exists a random variable Z such that Z n Z almost surely. Since we have assumed M is dissociated, the limit depends only on the tail σ-field T = n N F n and, thus, is deterministic by the 0- law. That δ(a; M) exists for any exchangeable M follows by the fact that any exchangeable L-structure is conditionally dissociated given its tail σ-field, by Theorem 6.. Almost sure existence of the infinite collection (δ(a; M), A m N L [m] ) follows by countable additivity of probability measures. To prove that (δ(a; M), A m N L [m] ) determines a unique, exchangeable probability measure on L N, we consider A L [m] and A L [n] such that A [m] = A, for m n. For fixed r n, the definition in (3) implies δ(a ; M [r] ) = {M ϕ r n [r] = A } A L [n] : A [m] =A = = = = A L [n] : A [m] =A r n r n r n r m ϕ:[n] [r] ϕ:[n] [r] A L [n] : A [m] =A ϕ:[m] [r] ϕ:[m] [r] ϕ:[m] [r] = δ(a; M [r] ). {M ϕ [r] = A} {M ϕ [r] {M ϕ [r] = A} {M ϕ [r] = A } extensions of ϕ to [n] [r] = A} (r m)(r m ) (r n + ) Since r n is arbitrary, the probability measures induced on L [m] and L [n] are consistent for all m n. Carathéodory s extension theorem implies an extension to a unique probability

18 8 HARRY CRANE measure on L N. Since each of the finite space distributions is exchangeable, so is the distribution induced on L N. By Proposition 6.2, every exchangeable L-structure projects to a unique limit in E L. Conversely, the law of every exchangeable L-structure M is determined by a probability measure ν on E L such that M ν, where ν is defined in (2). By the projective structure of L N, ν is uniquely determined by the induced measures for every n N. ν (n) (M) := ν ({M L N : M [n] = M}), M L [n], 6.2. σ-finite measures. We are especially interested in Lévy processes that evolve in continuous time and, therefore, can jump infinitely often in bounded time intervals. To see the additional possibilities in this case, let L = () so that µ is an exchangeable measure on subsets of N. For c > 0, we define µ(dm) = c {M = (N; {i})}, i= which assigns mass c to the singleton subsets of N and, thus, has infinite total mass. For n =, 2,..., the restriction of µ to L [n] is { µ (n) c, M = ([n]; {i}), (M) =, otherwise, which is finite and exchangeable on L [n] \{([n]; )}. On the other hand, let c, c 0 and define µ(dm) = c {M = (N; {i})} + c {M = (N; {i, j})}, i= i= j=i+ so that singletons have mass c and doubletons have mass c. For n N, µ (n) (M) = c n n {M = ([n]; {i})} + c i= i= j=n+ n {M = ([n]; {i})} + c i= j=i+ n {M = ([n]; {i, j})}, which is finite only if c = 0. (The middle term in the above expression results because the restriction of any (N; {i, n + j}) to [n] is ([n]; {i}), for every j = n +, n + 2,....) Immediately, µ satisfies (8) only if it assigns no mass to doubleton subsets. The same argument rules out tripletons, quadrupletons, and so on. Theorem 6.3. Let L = (i,..., i k ) be a signature and µ be an exchangeable measure on L N that satisfies (8). Then there exists a unique measure ν 0 on E L satisfying (28) and measures µ λ satisfying (8), (24), (25), and (26) such that i k (30) µ = ν 0 + µ λ, for µ defined in (27). λ q= λ q We first show that any µ constructed as in (30) satisfies (8). Proposition 6.4. Let ν 0 satisfy (28). Then ν in (2) satisfies (8). 0

19 COMBINATORIAL LÉVY PROCESSES 9 Proof. The lefthand side of (8) follows immediately from the lefthand side of (28). For the righthand side of (8), we need to show ν 0 ({M L N : M [n] 0 L }) < for all n N. [n] We note that {M L N : M [n] 0 L [n] } = s={s < <s ik } [n] {M L N : M s 0 L s }, because M [n] = 0 L [n] only if M s is trivial for all s [n] with s = i k. By exchangeability of ν 0, ν 0 ({M L N : M s 0 L s }) = ( W({0 L [i k ] }))ν 0(dW) E L for every s = {s < < s ik } [n]. Thus, ν 0 ({M L N : M [n] 0 L [n] }) = ν 0 s={s < <s ik } [n] s={s < <s ik } [n] n i k <, by the righthand side of (28). The proof is complete. {M L N : M s 0 L s } ν 0 ({M L N : M s 0 L s }) E L ( W({0 L [i k ] }))ν 0(dW) Proposition 6.5. Let L = (i,..., i k ) be a signature, q =,..., i k, λ q, and suppose that µ λ is a measure on L N satisfying (8), (24), (25), and (26). Then µ, as defined in (27), satisfies (8). λ Proof. Let λ q for some q =,..., i k. By (25), M = M s λ for µ λ -almost every M L N ; whence, µ λ ({M L N : M σ s,λ [n] 0 L [n] }) = 0 for s = {s,..., s q } [n]. For s = {s,..., s q } [n], we observe that M σ s,λ [n] = M σ s,λ and, therefore, [n] µ λ ({M L N : M σ s,λ [n] 0 L [n] }) = µ λ({m L N : M σ s,λ 0 L [n] }) = µ λ({m L N : M [n] 0 L [n] }). It follows that µ λ ({M L N : M [n] 0 L [n] }) = = = [n] s N:λ(s)=λ s [n]:λ(s)=λ s [n]:λ(s)=λ µ λ ({M L N : M σ s,λ [n] 0 L [n] }) µ λ ({M L N : M σ s,λ 0 L [n] [n] }) µ λ ({M L N : M [n] 0 L [n] }) n q µ λ ({M L N : M [n] 0 L [n] }) <,


More information

Lecture 20 : Markov Chains

Lecture 20 : Markov Chains CSCI 3560 Probability and Computing Instructor: Bogdan Chlebus Lecture 0 : Markov Chains We consider stochastic processes. A process represents a system that evolves through incremental changes called

More information

Jónsson posets and unary Jónsson algebras

Jónsson posets and unary Jónsson algebras Jónsson posets and unary Jónsson algebras Keith A. Kearnes and Greg Oman Abstract. We show that if P is an infinite poset whose proper order ideals have cardinality strictly less than P, and κ is a cardinal

More information

Lecture 12. F o s, (1.1) F t := s>t

Lecture 12. F o s, (1.1) F t := s>t Lecture 12 1 Brownian motion: the Markov property Let C := C(0, ), R) be the space of continuous functions mapping from 0, ) to R, in which a Brownian motion (B t ) t 0 almost surely takes its value. Let

More information

Measures. 1 Introduction. These preliminary lecture notes are partly based on textbooks by Athreya and Lahiri, Capinski and Kopp, and Folland.

Measures. 1 Introduction. These preliminary lecture notes are partly based on textbooks by Athreya and Lahiri, Capinski and Kopp, and Folland. Measures These preliminary lecture notes are partly based on textbooks by Athreya and Lahiri, Capinski and Kopp, and Folland. 1 Introduction Our motivation for studying measure theory is to lay a foundation

More information

ELEMENTS OF PROBABILITY THEORY

ELEMENTS OF PROBABILITY THEORY ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable

More information

Part V. 17 Introduction: What are measures and why measurable sets. Lebesgue Integration Theory

Part V. 17 Introduction: What are measures and why measurable sets. Lebesgue Integration Theory Part V 7 Introduction: What are measures and why measurable sets Lebesgue Integration Theory Definition 7. (Preliminary). A measure on a set is a function :2 [ ] such that. () = 2. If { } = is a finite

More information

MAT 570 REAL ANALYSIS LECTURE NOTES. Contents. 1. Sets Functions Countability Axiom of choice Equivalence relations 9

MAT 570 REAL ANALYSIS LECTURE NOTES. Contents. 1. Sets Functions Countability Axiom of choice Equivalence relations 9 MAT 570 REAL ANALYSIS LECTURE NOTES PROFESSOR: JOHN QUIGG SEMESTER: FALL 204 Contents. Sets 2 2. Functions 5 3. Countability 7 4. Axiom of choice 8 5. Equivalence relations 9 6. Real numbers 9 7. Extended

More information

Filters in Analysis and Topology

Filters in Analysis and Topology Filters in Analysis and Topology David MacIver July 1, 2004 Abstract The study of filters is a very natural way to talk about convergence in an arbitrary topological space, and carries over nicely into

More information

MATHS 730 FC Lecture Notes March 5, Introduction

MATHS 730 FC Lecture Notes March 5, Introduction 1 INTRODUCTION MATHS 730 FC Lecture Notes March 5, 2014 1 Introduction Definition. If A, B are sets and there exists a bijection A B, they have the same cardinality, which we write as A, #A. If there exists

More information

Course 311: Michaelmas Term 2005 Part III: Topics in Commutative Algebra

Course 311: Michaelmas Term 2005 Part III: Topics in Commutative Algebra Course 311: Michaelmas Term 2005 Part III: Topics in Commutative Algebra D. R. Wilkins Contents 3 Topics in Commutative Algebra 2 3.1 Rings and Fields......................... 2 3.2 Ideals...............................

More information

Chapter 1. Measure Spaces. 1.1 Algebras and σ algebras of sets Notation and preliminaries

Chapter 1. Measure Spaces. 1.1 Algebras and σ algebras of sets Notation and preliminaries Chapter 1 Measure Spaces 1.1 Algebras and σ algebras of sets 1.1.1 Notation and preliminaries We shall denote by X a nonempty set, by P(X) the set of all parts (i.e., subsets) of X, and by the empty set.

More information

Mean-field dual of cooperative reproduction

Mean-field dual of cooperative reproduction The mean-field dual of systems with cooperative reproduction joint with Tibor Mach (Prague) A. Sturm (Göttingen) Friday, July 6th, 2018 Poisson construction of Markov processes Let (X t ) t 0 be a continuous-time

More information

Homotopy and homology groups of the n-dimensional Hawaiian earring

Homotopy and homology groups of the n-dimensional Hawaiian earring F U N D A M E N T A MATHEMATICAE 165 (2000) Homotopy and homology groups of the n-dimensional Hawaiian earring by Katsuya E d a (Tokyo) and Kazuhiro K a w a m u r a (Tsukuba) Abstract. For the n-dimensional

More information

Brownian Motion. 1 Definition Brownian Motion Wiener measure... 3

Brownian Motion. 1 Definition Brownian Motion Wiener measure... 3 Brownian Motion Contents 1 Definition 2 1.1 Brownian Motion................................. 2 1.2 Wiener measure.................................. 3 2 Construction 4 2.1 Gaussian process.................................

More information

AN ALGEBRAIC APPROACH TO GENERALIZED MEASURES OF INFORMATION

AN ALGEBRAIC APPROACH TO GENERALIZED MEASURES OF INFORMATION AN ALGEBRAIC APPROACH TO GENERALIZED MEASURES OF INFORMATION Daniel Halpern-Leistner 6/20/08 Abstract. I propose an algebraic framework in which to study measures of information. One immediate consequence

More information

INTRODUCTION TO MARKOV CHAIN MONTE CARLO

INTRODUCTION TO MARKOV CHAIN MONTE CARLO INTRODUCTION TO MARKOV CHAIN MONTE CARLO 1. Introduction: MCMC In its simplest incarnation, the Monte Carlo method is nothing more than a computerbased exploitation of the Law of Large Numbers to estimate

More information

1 Stochastic Dynamic Programming

1 Stochastic Dynamic Programming 1 Stochastic Dynamic Programming Formally, a stochastic dynamic program has the same components as a deterministic one; the only modification is to the state transition equation. When events in the future

More information

Measure and integration

Measure and integration Chapter 5 Measure and integration In calculus you have learned how to calculate the size of different kinds of sets: the length of a curve, the area of a region or a surface, the volume or mass of a solid.

More information

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS PROBABILITY: LIMIT THEOREMS II, SPRING 15. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please

More information

Ergodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R.

Ergodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R. Ergodic Theorems Samy Tindel Purdue University Probability Theory 2 - MA 539 Taken from Probability: Theory and examples by R. Durrett Samy T. Ergodic theorems Probability Theory 1 / 92 Outline 1 Definitions

More information

Lecture 19 L 2 -Stochastic integration

Lecture 19 L 2 -Stochastic integration Lecture 19: L 2 -Stochastic integration 1 of 12 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 19 L 2 -Stochastic integration The stochastic integral for processes

More information

Some Background Material

Some Background Material Chapter 1 Some Background Material In the first chapter, we present a quick review of elementary - but important - material as a way of dipping our toes in the water. This chapter also introduces important

More information

Measures and Measure Spaces

Measures and Measure Spaces Chapter 2 Measures and Measure Spaces In summarizing the flaws of the Riemann integral we can focus on two main points: 1) Many nice functions are not Riemann integrable. 2) The Riemann integral does not

More information

Math 210B. Artin Rees and completions

Math 210B. Artin Rees and completions Math 210B. Artin Rees and completions 1. Definitions and an example Let A be a ring, I an ideal, and M an A-module. In class we defined the I-adic completion of M to be M = lim M/I n M. We will soon show

More information

Hardy-Stein identity and Square functions

Hardy-Stein identity and Square functions Hardy-Stein identity and Square functions Daesung Kim (joint work with Rodrigo Bañuelos) Department of Mathematics Purdue University March 28, 217 Daesung Kim (Purdue) Hardy-Stein identity UIUC 217 1 /

More information

Brownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539

Brownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539 Brownian motion Samy Tindel Purdue University Probability Theory 2 - MA 539 Mostly taken from Brownian Motion and Stochastic Calculus by I. Karatzas and S. Shreve Samy T. Brownian motion Probability Theory

More information

Introduction to Topology

Introduction to Topology Introduction to Topology Randall R. Holmes Auburn University Typeset by AMS-TEX Chapter 1. Metric Spaces 1. Definition and Examples. As the course progresses we will need to review some basic notions about

More information

A MODEL-THEORETIC PROOF OF HILBERT S NULLSTELLENSATZ

A MODEL-THEORETIC PROOF OF HILBERT S NULLSTELLENSATZ A MODEL-THEORETIC PROOF OF HILBERT S NULLSTELLENSATZ NICOLAS FORD Abstract. The goal of this paper is to present a proof of the Nullstellensatz using tools from a branch of logic called model theory. In

More information

DRAFT MAA6616 COURSE NOTES FALL 2015

DRAFT MAA6616 COURSE NOTES FALL 2015 Contents 1. σ-algebras 2 1.1. The Borel σ-algebra over R 5 1.2. Product σ-algebras 7 2. Measures 8 3. Outer measures and the Caratheodory Extension Theorem 11 4. Construction of Lebesgue measure 15 5.

More information

Integration on Measure Spaces

Integration on Measure Spaces Chapter 3 Integration on Measure Spaces In this chapter we introduce the general notion of a measure on a space X, define the class of measurable functions, and define the integral, first on a class of

More information

Sample Spaces, Random Variables

Sample Spaces, Random Variables Sample Spaces, Random Variables Moulinath Banerjee University of Michigan August 3, 22 Probabilities In talking about probabilities, the fundamental object is Ω, the sample space. (elements) in Ω are denoted

More information

EXISTENCE AND UNIQUENESS OF INFINITE COMPONENTS IN GENERIC RIGIDITY PERCOLATION 1. By Alexander E. Holroyd University of Cambridge

EXISTENCE AND UNIQUENESS OF INFINITE COMPONENTS IN GENERIC RIGIDITY PERCOLATION 1. By Alexander E. Holroyd University of Cambridge The Annals of Applied Probability 1998, Vol. 8, No. 3, 944 973 EXISTENCE AND UNIQUENESS OF INFINITE COMPONENTS IN GENERIC RIGIDITY PERCOLATION 1 By Alexander E. Holroyd University of Cambridge We consider

More information

Operads. Spencer Liang. March 10, 2015

Operads. Spencer Liang. March 10, 2015 Operads Spencer Liang March 10, 2015 1 Introduction The notion of an operad was created in order to have a well-defined mathematical object which encodes the idea of an abstract family of composable n-ary

More information

Poisson random measure: motivation

Poisson random measure: motivation : motivation The Lévy measure provides the expected number of jumps by time unit, i.e. in a time interval of the form: [t, t + 1], and of a certain size Example: ν([1, )) is the expected number of jumps

More information

arxiv: v1 [math.co] 5 Apr 2019

arxiv: v1 [math.co] 5 Apr 2019 arxiv:1904.02924v1 [math.co] 5 Apr 2019 The asymptotics of the partition of the cube into Weyl simplices, and an encoding of a Bernoulli scheme A. M. Vershik 02.02.2019 Abstract We suggest a combinatorial

More information

Infinitely iterated Brownian motion

Infinitely iterated Brownian motion Mathematics department Uppsala University (Joint work with Nicolas Curien) This talk was given in June 2013, at the Mittag-Leffler Institute in Stockholm, as part of the Symposium in honour of Olav Kallenberg

More information

Empirical Processes: General Weak Convergence Theory

Empirical Processes: General Weak Convergence Theory Empirical Processes: General Weak Convergence Theory Moulinath Banerjee May 18, 2010 1 Extended Weak Convergence The lack of measurability of the empirical process with respect to the sigma-field generated

More information

18.175: Lecture 2 Extension theorems, random variables, distributions

18.175: Lecture 2 Extension theorems, random variables, distributions 18.175: Lecture 2 Extension theorems, random variables, distributions Scott Sheffield MIT Outline Extension theorems Characterizing measures on R d Random variables Outline Extension theorems Characterizing

More information

Stochastic Processes

Stochastic Processes qmc082.tex. Version of 30 September 2010. Lecture Notes on Quantum Mechanics No. 8 R. B. Griffiths References: Stochastic Processes CQT = R. B. Griffiths, Consistent Quantum Theory (Cambridge, 2002) DeGroot

More information

The small ball property in Banach spaces (quantitative results)

The small ball property in Banach spaces (quantitative results) The small ball property in Banach spaces (quantitative results) Ehrhard Behrends Abstract A metric space (M, d) is said to have the small ball property (sbp) if for every ε 0 > 0 there exists a sequence

More information

Classifying classes of structures in model theory

Classifying classes of structures in model theory Classifying classes of structures in model theory Saharon Shelah The Hebrew University of Jerusalem, Israel, and Rutgers University, NJ, USA ECM 2012 Saharon Shelah (HUJI and Rutgers) Classifying classes

More information

Equational Logic. Chapter Syntax Terms and Term Algebras

Equational Logic. Chapter Syntax Terms and Term Algebras Chapter 2 Equational Logic 2.1 Syntax 2.1.1 Terms and Term Algebras The natural logic of algebra is equational logic, whose propositions are universally quantified identities between terms built up from

More information

n [ F (b j ) F (a j ) ], n j=1(a j, b j ] E (4.1)

n [ F (b j ) F (a j ) ], n j=1(a j, b j ] E (4.1) 1.4. CONSTRUCTION OF LEBESGUE-STIELTJES MEASURES In this section we shall put to use the Carathéodory-Hahn theory, in order to construct measures with certain desirable properties first on the real line

More information

Equivalence of History and Generator ɛ-machines

Equivalence of History and Generator ɛ-machines MCS Codes: 37A50 37B10 60J10 Equivalence of History and Generator ɛ-machines Nicholas F. Travers 1, 2, 1, 2, 3, 4, and James P. Crutchfield 1 Complexity Sciences Center 2 Mathematics Department 3 Physics

More information

SUMMARY OF RESULTS ON PATH SPACES AND CONVERGENCE IN DISTRIBUTION FOR STOCHASTIC PROCESSES

SUMMARY OF RESULTS ON PATH SPACES AND CONVERGENCE IN DISTRIBUTION FOR STOCHASTIC PROCESSES SUMMARY OF RESULTS ON PATH SPACES AND CONVERGENCE IN DISTRIBUTION FOR STOCHASTIC PROCESSES RUTH J. WILLIAMS October 2, 2017 Department of Mathematics, University of California, San Diego, 9500 Gilman Drive,

More information

1. Quivers and their representations: Basic definitions and examples.

1. Quivers and their representations: Basic definitions and examples. 1 Quivers and their representations: Basic definitions and examples 11 Quivers A quiver Q (sometimes also called a directed graph) consists of vertices and oriented edges (arrows): loops and multiple arrows

More information

On Backward Product of Stochastic Matrices

On Backward Product of Stochastic Matrices On Backward Product of Stochastic Matrices Behrouz Touri and Angelia Nedić 1 Abstract We study the ergodicity of backward product of stochastic and doubly stochastic matrices by introducing the concept

More information

Building Infinite Processes from Finite-Dimensional Distributions

Building Infinite Processes from Finite-Dimensional Distributions Chapter 2 Building Infinite Processes from Finite-Dimensional Distributions Section 2.1 introduces the finite-dimensional distributions of a stochastic process, and shows how they determine its infinite-dimensional

More information

STOCHASTIC PROCESSES Basic notions

STOCHASTIC PROCESSES Basic notions J. Virtamo 38.3143 Queueing Theory / Stochastic processes 1 STOCHASTIC PROCESSES Basic notions Often the systems we consider evolve in time and we are interested in their dynamic behaviour, usually involving

More information

arxiv: v2 [math.ag] 24 Jun 2015

arxiv: v2 [math.ag] 24 Jun 2015 TRIANGULATIONS OF MONOTONE FAMILIES I: TWO-DIMENSIONAL FAMILIES arxiv:1402.0460v2 [math.ag] 24 Jun 2015 SAUGATA BASU, ANDREI GABRIELOV, AND NICOLAI VOROBJOV Abstract. Let K R n be a compact definable set

More information

SMSTC (2007/08) Probability.

SMSTC (2007/08) Probability. SMSTC (27/8) Probability www.smstc.ac.uk Contents 12 Markov chains in continuous time 12 1 12.1 Markov property and the Kolmogorov equations.................... 12 2 12.1.1 Finite state space.................................

More information

Markov processes and queueing networks

Markov processes and queueing networks Inria September 22, 2015 Outline Poisson processes Markov jump processes Some queueing networks The Poisson distribution (Siméon-Denis Poisson, 1781-1840) { } e λ λ n n! As prevalent as Gaussian distribution

More information

The strictly 1/2-stable example

The strictly 1/2-stable example The strictly 1/2-stable example 1 Direct approach: building a Lévy pure jump process on R Bert Fristedt provided key mathematical facts for this example. A pure jump Lévy process X is a Lévy process such

More information

Introduction to Dynamical Systems

Introduction to Dynamical Systems Introduction to Dynamical Systems France-Kosovo Undergraduate Research School of Mathematics March 2017 This introduction to dynamical systems was a course given at the march 2017 edition of the France

More information

7. Homotopy and the Fundamental Group

7. Homotopy and the Fundamental Group 7. Homotopy and the Fundamental Group The group G will be called the fundamental group of the manifold V. J. Henri Poincaré, 895 The properties of a topological space that we have developed so far have

More information

arxiv: v3 [math.pr] 18 Aug 2017

arxiv: v3 [math.pr] 18 Aug 2017 Sparse Exchangeable Graphs and Their Limits via Graphon Processes arxiv:1601.07134v3 [math.pr] 18 Aug 2017 Christian Borgs Microsoft Research One Memorial Drive Cambridge, MA 02142, USA Jennifer T. Chayes

More information

Reinforcement Learning

Reinforcement Learning Reinforcement Learning March May, 2013 Schedule Update Introduction 03/13/2015 (10:15-12:15) Sala conferenze MDPs 03/18/2015 (10:15-12:15) Sala conferenze Solving MDPs 03/20/2015 (10:15-12:15) Aula Alpha

More information

Erdős-Renyi random graphs basics

Erdős-Renyi random graphs basics Erdős-Renyi random graphs basics Nathanaël Berestycki U.B.C. - class on percolation We take n vertices and a number p = p(n) with < p < 1. Let G(n, p(n)) be the graph such that there is an edge between

More information

Applications of model theory in extremal graph combinatorics

Applications of model theory in extremal graph combinatorics Applications of model theory in extremal graph combinatorics Artem Chernikov (IMJ-PRG, UCLA) Logic Colloquium Helsinki, August 4, 2015 Szemerédi regularity lemma Theorem [E. Szemerédi, 1975] Every large

More information

DISTINGUISHING PARTITIONS AND ASYMMETRIC UNIFORM HYPERGRAPHS

DISTINGUISHING PARTITIONS AND ASYMMETRIC UNIFORM HYPERGRAPHS DISTINGUISHING PARTITIONS AND ASYMMETRIC UNIFORM HYPERGRAPHS M. N. ELLINGHAM AND JUSTIN Z. SCHROEDER In memory of Mike Albertson. Abstract. A distinguishing partition for an action of a group Γ on a set

More information

Generalized Pigeonhole Properties of Graphs and Oriented Graphs

Generalized Pigeonhole Properties of Graphs and Oriented Graphs Europ. J. Combinatorics (2002) 23, 257 274 doi:10.1006/eujc.2002.0574 Available online at http://www.idealibrary.com on Generalized Pigeonhole Properties of Graphs and Oriented Graphs ANTHONY BONATO, PETER

More information

Measurable Choice Functions

Measurable Choice Functions (January 19, 2013) Measurable Choice Functions Paul Garrett garrett@math.umn.edu http://www.math.umn.edu/ garrett/ [This document is http://www.math.umn.edu/ garrett/m/fun/choice functions.pdf] This note

More information

Characterizing Ideal Weighted Threshold Secret Sharing

Characterizing Ideal Weighted Threshold Secret Sharing Characterizing Ideal Weighted Threshold Secret Sharing Amos Beimel Tamir Tassa Enav Weinreb August 12, 2004 Abstract Weighted threshold secret sharing was introduced by Shamir in his seminal work on secret

More information

COMBINATORIAL GROUP THEORY NOTES

COMBINATORIAL GROUP THEORY NOTES COMBINATORIAL GROUP THEORY NOTES These are being written as a companion to Chapter 1 of Hatcher. The aim is to give a description of some of the group theory required to work with the fundamental groups

More information

THE MAXIMAL SUBGROUPS AND THE COMPLEXITY OF THE FLOW SEMIGROUP OF FINITE (DI)GRAPHS

THE MAXIMAL SUBGROUPS AND THE COMPLEXITY OF THE FLOW SEMIGROUP OF FINITE (DI)GRAPHS THE MAXIMAL SUBGROUPS AND THE COMPLEXITY OF THE FLOW SEMIGROUP OF FINITE (DI)GRAPHS GÁBOR HORVÁTH, CHRYSTOPHER L. NEHANIV, AND KÁROLY PODOSKI Dedicated to John Rhodes on the occasion of his 80th birthday.

More information

JUMPS IN SPEEDS OF HEREDITARY PROPERTIES IN FINITE RELATIONAL LANGUAGES

JUMPS IN SPEEDS OF HEREDITARY PROPERTIES IN FINITE RELATIONAL LANGUAGES JUMPS IN SPEEDS OF HEREDITARY PROPERTIES IN FINITE RELATIONAL LANGUAGES MICHAEL C. LASKOWSKI AND CAROLINE A. TERRY Abstract. Given a finite relational language L, a hereditary L-property is a class of

More information

Stochastic Processes. Winter Term Paolo Di Tella Technische Universität Dresden Institut für Stochastik

Stochastic Processes. Winter Term Paolo Di Tella Technische Universität Dresden Institut für Stochastik Stochastic Processes Winter Term 2016-2017 Paolo Di Tella Technische Universität Dresden Institut für Stochastik Contents 1 Preliminaries 5 1.1 Uniform integrability.............................. 5 1.2

More information

Classification of root systems

Classification of root systems Classification of root systems September 8, 2017 1 Introduction These notes are an approximate outline of some of the material to be covered on Thursday, April 9; Tuesday, April 14; and Thursday, April

More information

Lecture 9. d N(0, 1). Now we fix n and think of a SRW on [0,1]. We take the k th step at time k n. and our increments are ± 1

Lecture 9. d N(0, 1). Now we fix n and think of a SRW on [0,1]. We take the k th step at time k n. and our increments are ± 1 Random Walks and Brownian Motion Tel Aviv University Spring 011 Lecture date: May 0, 011 Lecture 9 Instructor: Ron Peled Scribe: Jonathan Hermon In today s lecture we present the Brownian motion (BM).

More information

Criteria for existence of semigroup homomorphisms and projective rank functions. George M. Bergman

Criteria for existence of semigroup homomorphisms and projective rank functions. George M. Bergman Criteria for existence of semigroup homomorphisms and projective rank functions George M. Bergman Suppose A, S, and T are semigroups, e: A S and f: A T semigroup homomorphisms, and X a generating set for

More information

Statistical Inference on Large Contingency Tables: Convergence, Testability, Stability. COMPSTAT 2010 Paris, August 23, 2010

Statistical Inference on Large Contingency Tables: Convergence, Testability, Stability. COMPSTAT 2010 Paris, August 23, 2010 Statistical Inference on Large Contingency Tables: Convergence, Testability, Stability Marianna Bolla Institute of Mathematics Budapest University of Technology and Economics marib@math.bme.hu COMPSTAT

More information

Metric Spaces and Topology

Metric Spaces and Topology Chapter 2 Metric Spaces and Topology From an engineering perspective, the most important way to construct a topology on a set is to define the topology in terms of a metric on the set. This approach underlies

More information

The Skorokhod reflection problem for functions with discontinuities (contractive case)

The Skorokhod reflection problem for functions with discontinuities (contractive case) The Skorokhod reflection problem for functions with discontinuities (contractive case) TAKIS KONSTANTOPOULOS Univ. of Texas at Austin Revised March 1999 Abstract Basic properties of the Skorokhod reflection

More information