Entropy and Ergodic Theory Notes 22: The Kolmogorov–Sinai entropy of a measure-preserving system
1 Joinings and channels

1.1 Joinings

Definition 1. If $(X, \mu, T)$ and $(Y, \nu, S)$ are MPSs, then a joining of them is a coupling $\lambda \in \mathrm{Prob}(\mu, \nu)$ which is invariant under $T \times S$.

The product measure $\mu \times \nu$ is always a joining, but some pairs of systems have many others.

Example. If $\pi : (X, \mu, T) \longrightarrow (Y, \nu, S)$ is a factor map, it has an associated graphical joining:
$$\mathrm{gr}(\mu, \pi)(U) := \mu\{x : (x, \pi(x)) \in U\} \qquad \text{for measurable } U \subseteq X \times Y.$$
It is the unique joining supported on the $(T \times S)$-invariant subset
$$\mathrm{graph}(\pi) = \{(x, \pi(x)) : x \in X\}.$$

Example. A little more generally, given a diagram
$$(X, \mu, T) \overset{\varphi}{\longleftarrow} (Z, \theta, R) \overset{\pi}{\longrightarrow} (Y, \nu, S),$$
the associated joint distribution is the image measure
$$(\varphi, \pi)_*\theta(U) = \theta\{z : (\varphi(z), \pi(z)) \in U\}.$$
A graphical joining $\mathrm{gr}(\mu, \pi)$ is the joint distribution of $\mathrm{id}_X$ and $\pi$.

Definition 1 and the second example above have obvious generalizations to collections of more than two systems.
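The definitions above live on infinite product spaces, but the basic objects, couplings with prescribed marginals and graphical joinings, can be illustrated in the simplest finite setting. The sketch below is our own toy example, not part of the notes: a coupling of two probability vectors is a nonnegative matrix whose row sums and column sums recover the marginals.

```python
import numpy as np

# A coupling of probability vectors mu on X and nu on Y is a matrix
# lam[i, j] >= 0 whose row sums give mu and whose column sums give nu.
mu = np.array([0.5, 0.3, 0.2])
nu = np.array([0.6, 0.4])

# The product coupling always exists, mirroring mu x nu above.
product = np.outer(mu, nu)
assert np.allclose(product.sum(axis=1), mu)
assert np.allclose(product.sum(axis=0), nu)

# A "graphical" coupling: push mu through a map f and couple mu with
# f_*mu by putting all mass on the graph {(x, f(x))}, as in the
# factor-map example above.
f = [0, 1, 0]                    # a map from {0,1,2} to {0,1}
graphical = np.zeros((3, 2))
for x in range(3):
    graphical[x, f[x]] = mu[x]
f_mu = graphical.sum(axis=0)     # the image measure f_*mu
assert np.allclose(graphical.sum(axis=1), mu)
```

The graphical coupling is supported on three of the six entries of the matrix, exactly the pairs $(x, f(x))$.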
1.2 Channels

Joinings show up all over ergodic theory, but in this course we mostly use them for pairs of sources. In this setting, we start with a simple but powerful description of general joinings provided by the disintegration theorem from Lecture 20.

Definition 2. Let $A$ and $B$ be finite alphabets and let $S_A$ and $S_B$ be the respective shifts on $A^{\mathbb{Z}}$ and $B^{\mathbb{Z}}$. A stationary channel from $A^{\mathbb{Z}}$ to $B^{\mathbb{Z}}$ is a kernel $\theta$ from $A^{\mathbb{Z}}$ to $B^{\mathbb{Z}}$ such that
$$\theta_{S_A(x)}(S_B V) = \theta_x(V)$$
for all $x \in A^{\mathbb{Z}}$ and measurable $V \subseteq B^{\mathbb{Z}}$.

The next definition is also a generalization of an idea from the information theory part of the course.

Definition 3. If $(X, \mathcal{B}, \mu)$ is a probability space and $\theta$ is a stochastic kernel from $(X, \mathcal{B})$ to $(Y, \mathcal{C})$, then the input-output measure or hookup of $\mu$ and $\theta$ is the measure $\mu\theta$ on $(X \times Y, \mathcal{B} \otimes \mathcal{C})$ defined by
$$(\mu\theta)(U) := \int \theta_x\{y \in Y : (x, y) \in U\}\ \mu(dx).$$

Example.
1. A constant probability kernel is one for which $\theta_x = \nu$ for some fixed probability measure $\nu$ on $(Y, \mathcal{C})$. In this case $\mu\theta = \mu \times \nu$.
2. Any stationary code $\pi : A^{\mathbb{Z}} \longrightarrow B^{\mathbb{Z}}$ defines a deterministic channel according to $\theta_x = \delta_{\pi(x)}$. If $\mu$ is a shift-invariant measure on $A^{\mathbb{Z}}$, then its hookup to this channel equals the graphical joining $\mathrm{gr}(\mu, \pi)$.
3. If $[A, \theta_0, B]$ is a DMC, then its infinite extension is the stationary channel from $A^{\mathbb{Z}}$ to $B^{\mathbb{Z}}$ defined by
$$\theta_x := \prod_{n \in \mathbb{Z}} \theta_0(\,\cdot \mid x_n).$$
A stationary channel of this form is sometimes also called a DMC.
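Definition 3 has a transparent finite analogue: if $\mu$ is a probability vector on $X$ and $\theta$ a row-stochastic kernel matrix, the hookup is the joint matrix with entries $\mu(x)\,\theta_x(y)$. A minimal sketch, with made-up numbers, illustrating Examples 1 and 2 above:

```python
import numpy as np

def hookup(mu, theta):
    """Joint measure on X x Y from input measure mu (a probability
    vector) and kernel theta (a row-stochastic matrix)."""
    return mu[:, None] * theta

mu = np.array([0.25, 0.75])

# Example 1: a constant kernel theta_x = nu gives the product measure.
nu = np.array([0.1, 0.9])
const = np.tile(nu, (2, 1))
assert np.allclose(hookup(mu, const), np.outer(mu, nu))

# Example 2: a deterministic kernel theta_x = delta_{pi(x)} puts the
# hookup on the graph of pi -- the graphical joining of Section 1.1.
pi = [1, 0]                      # pi swaps the two symbols
det = np.zeros((2, 2))
for x, y in enumerate(pi):
    det[x, y] = 1.0
joint = hookup(mu, det)
assert np.isclose(joint[0, 1], 0.25) and np.isclose(joint[1, 0], 0.75)
```

The row sums of any hookup recover $\mu$, while the column sums give the output distribution of the channel.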
Stationary channels are the basic objects in various generalizations of the channel coding theorem beyond the memoryless setting. Here we use them as a way of describing joinings between sources.

Theorem 4. Let $[A^{\mathbb{Z}}, \mu]$ and $[B^{\mathbb{Z}}, \nu]$ be sources and let $\lambda$ be a joining of them. Then there is a stationary channel $\theta$ from $A^{\mathbb{Z}}$ to $B^{\mathbb{Z}}$ such that $\lambda = \mu\theta$, and $\theta$ is unique up to agreement $\mu$-a.e.

Proof. On the probability space $(A^{\mathbb{Z}} \times B^{\mathbb{Z}}, \lambda)$, let $\mathcal{F}$ be the $\sigma$-subalgebra of measurable sets which are lifted from $A^{\mathbb{Z}}$:
$$\mathcal{F} = \{U \times B^{\mathbb{Z}} : U \subseteq A^{\mathbb{Z}} \text{ measurable}\}.$$
Applying the disintegration theorem to $\lambda$ and $\mathcal{F}$, we obtain a kernel $(x, y) \mapsto \lambda_{(x,y)}$ from $A^{\mathbb{Z}} \times B^{\mathbb{Z}}$ to itself such that

(i) for each measurable $U \subseteq A^{\mathbb{Z}} \times B^{\mathbb{Z}}$, the map $(x, y) \mapsto \lambda_{(x,y)}(U)$ is $\mathcal{F}$-measurable, and

(ii) for each bounded measurable $f : A^{\mathbb{Z}} \times B^{\mathbb{Z}} \longrightarrow \mathbb{R}$, we have
$$\mathrm{E}_\lambda(f \mid \mathcal{F})(x, y) = \int f\ d\lambda_{(x,y)} \qquad \text{for } \lambda\text{-a.e. } (x, y).$$

Any function on $A^{\mathbb{Z}} \times B^{\mathbb{Z}}$ which is $\mathcal{F}$-measurable must depend only on the coordinate in $A^{\mathbb{Z}}$. So property (i) implies that $\lambda_{(x,y)}$ is really just a function of $x$. We henceforth write it as $\lambda_x$ instead. Moreover, each of the maps $x \mapsto \lambda_x(U)$ is $\mathcal{F}$-measurable as a function of $(x, y)$ if and only if it is measurable as a function of $x$ alone, by the definition of $\mathcal{F}$. Therefore we have identified $\lambda$ with a kernel from $A^{\mathbb{Z}}$ to $A^{\mathbb{Z}} \times B^{\mathbb{Z}}$.

On the other hand, consider a bounded measurable function $f$ on $A^{\mathbb{Z}}$, and define $\widehat{f}(x, y) := f(x)$. Then $\widehat{f}$ is $\mathcal{F}$-measurable, so property (ii) implies that
$$f(x) = \widehat{f}(x, y) = \int \widehat{f}(x', y')\ \lambda_x(dx', dy') = \int f(x')\ \lambda_x(dx', dy')$$
for $\mu$-a.e. $x$. By applying this to all functions $f$ from a countable dense subset of $C(A^{\mathbb{Z}})$, it follows that $\mu$-a.e. $x$ has the property that
$$f(x) = \int f(x')\ \lambda_x(dx', dy') \qquad \text{for all } f \in C(A^{\mathbb{Z}}).$$
This is possible only if $\lambda_x$ is supported on $\{x\} \times B^{\mathbb{Z}}$. So for $\mu$-a.e. $x$, say for every $x$ in a full-measure subset $X_0 \subseteq A^{\mathbb{Z}}$, we can write $\lambda_x = \delta_x \times \theta_x$ for some $\theta_x \in \mathrm{Prob}(B^{\mathbb{Z}})$.

The map $x \mapsto \theta_x$ is now a kernel from $A^{\mathbb{Z}}$ to $B^{\mathbb{Z}}$ such that $\lambda = \mu\theta$. This kernel is essentially unique because of the corresponding property of $\lambda$. Finally, observe from the shift-invariance of $\mu$ and $\lambda$ that
$$\int \delta_x \times (S_B^{-1})_*(\theta_{S_A(x)})\ \mu(dx) = (S_A^{-1} \times S_B^{-1})_* \int (\delta_x \times \theta_x)\ \mu(dx) = (S_A^{-1} \times S_B^{-1})_*\lambda = \lambda = \int \delta_x \times \theta_x\ \mu(dx).$$
Therefore, by essential uniqueness, we must have
$$(S_B^{-1})_*(\theta_{S_A(x)}) = \theta_x \qquad \mu\text{-a.s.}$$

2 The Kolmogorov–Sinai theorem

Theorem 5 (Monotonicity under factor maps). If there exists a factor map $[A^{\mathbb{Z}}, \mu] \longrightarrow [B^{\mathbb{Z}}, \nu]$, then $h(\nu) \le h(\mu)$.

The proof first treats the case of a sliding block code, and then deduces the general case by an approximation. The first step is quite easy, but we need some preparations for the second step. The next lemma provides the approximations we need.

Lemma 6 (Approximation of stationary codes by sliding block codes). If $[A^{\mathbb{Z}}, \mu]$ is a source, $\pi : A^{\mathbb{Z}} \longrightarrow B^{\mathbb{Z}}$ is a stationary code, and $\varepsilon > 0$, then there is a sliding block code $\varphi : A^{\mathbb{Z}} \longrightarrow B^{\mathbb{Z}}$ such that
$$\mu\{x : \pi_0(x) \ne \varphi_0(x)\} < \varepsilon.$$

Proof. This is just simple measure theory: any finite-valued measurable function on $A^{\mathbb{Z}}$ can be approximated in measure by local functions.

In order to use Lemma 6, we need the ability to control entropy rates under the kind of approximation that it provides. This is done using a version of Fano's inequality.
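Before turning to Fano's inequality, the entropy behaviour of a sliding block code can be seen concretely by computing exact block entropies in a small example. The sketch below is our own illustration (the XOR code and the parameters are made up): it applies a window-3 local code to an i.i.d. Bernoulli($q$) source and verifies that $H(\nu_n) \le H(\mu_{n+2m})$ with $m = 1$, the inequality that drives the local case of Theorem 5.

```python
from itertools import product
from math import log2
from collections import defaultdict

q, m, n = 0.3, 1, 6

def H(dist):
    """Shannon entropy (in bits) of a dict of probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Push the exact distribution of input blocks x_{1-m..n+m} through the
# local code pi'_0(a, b, c) = a XOR b XOR c to get the law of nu_n.
out_dist = defaultdict(float)
for word in product([0, 1], repeat=n + 2 * m):
    prob = 1.0
    for s in word:
        prob *= q if s == 1 else 1 - q
    image = tuple(word[i] ^ word[i + 1] ^ word[i + 2] for i in range(n))
    out_dist[image] += prob

H_nu_n = H(out_dist)
H_mu_n2m = (n + 2 * m) * H({0: 1 - q, 1: q})   # i.i.d. block entropy
assert H_nu_n <= H_mu_n2m + 1e-9
```

Dividing both sides by $n$ and letting the block length grow reproduces the entropy-rate comparison of Step 1 below.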
Lemma 7 (Fano's inequality for general sources). Let $[A^{\mathbb{Z}}, \mu]$ and $[A^{\mathbb{Z}}, \nu]$ be two sources with the same alphabet, and let $\lambda$ be a joining. Let
$$p := \lambda\{(x, y) : x_0 \ne y_0\}.$$
Then
$$|h(\mu) - h(\nu)| \le H(p, 1-p) + p\log(|A| - 1).$$

Proof. For each $m \in \mathbb{Z}$, let $(\alpha_m, \beta_m)$ be the $m$th coordinate projection from $A^{\mathbb{Z}} \times A^{\mathbb{Z}}$ to $A \times A$. According to the measure $\lambda$ on $A^{\mathbb{Z}} \times A^{\mathbb{Z}}$, the process $(\alpha_n)_n$ has joint distribution $\mu$, and the process $(\beta_n)_n$ has joint distribution $\nu$. So for each $n \ge 1$ we have
$$H(\mu_n) = H(\alpha_1, \dots, \alpha_n) \le H(\beta_1, \dots, \beta_n) + H(\alpha_1, \dots, \alpha_n \mid \beta_1, \dots, \beta_n),$$
by the monotonicity of entropy and the chain rule. By subadditivity of conditional entropy, the right-hand side above is at most
$$H(\beta_1, \dots, \beta_n) + \sum_{i=1}^{n} H(\alpha_i \mid \beta_i).$$
By the original Fano inequality, each term in the sum here is at most
$$H(p, 1-p) + p\log(|A| - 1).$$
Dividing by $n$ and letting $n \longrightarrow \infty$, we obtain
$$h(\mu) \le h(\nu) + H(p, 1-p) + p\log(|A| - 1).$$
The corresponding inequality with $\mu$ and $\nu$ switched follows by symmetry.

Proof of Theorem 5. Let $\pi = (\pi_n)_n : [A^{\mathbb{Z}}, \mu] \longrightarrow [B^{\mathbb{Z}}, \nu]$ be the factor map, and let $\alpha = (\alpha_n)_n$ and $\beta = (\beta_n)_n$ be stationary stochastic processes with distributions $\mu$ and $\nu$ respectively. Our assertion about $\pi$ implies that $\pi(\alpha) \overset{\mathrm{law}}{=} \beta$.

Step 1. Suppose first that $\pi_0$ is local. This means that
$$\pi_0(x) = \pi_0'(x_{-m}, x_{-m+1}, \dots, x_m)$$
for some $m \in \mathbb{N}$ and some $\pi_0' : A^{2m+1} \longrightarrow B$. By stationarity, this turns into
$$\pi_n(x) = \pi_0'(x_{n-m}, x_{n-m+1}, \dots, x_{n+m}) \qquad \text{for all } n \in \mathbb{Z},$$
and therefore
$$(\beta_1, \dots, \beta_n) \overset{\mathrm{law}}{=} (\pi_1(\alpha), \pi_2(\alpha), \dots, \pi_n(\alpha)) = \big(\pi_0'(\alpha_{1-m}, \dots, \alpha_{1+m}),\ \pi_0'(\alpha_{2-m}, \dots, \alpha_{2+m}),\ \dots,\ \pi_0'(\alpha_{n-m}, \dots, \alpha_{n+m})\big).$$
The right-hand side here is determined by $(\alpha_{1-m}, \dots, \alpha_{n+m})$, so stationarity and the monotonicity of entropy imply that
$$H(\nu_n) = H(\beta_1, \dots, \beta_n) = H(\pi_1(\alpha), \pi_2(\alpha), \dots, \pi_n(\alpha)) \le H(\alpha_{1-m}, \dots, \alpha_{n+m}) = H(\mu_{n+2m}).$$
Dividing by $n$ and letting $n \longrightarrow \infty$, this gives $h(\nu) \le h(\mu)$, since $m$ is fixed.

Step 2. Now consider a general factor map $\pi$. Let $\varepsilon > 0$. By Lemma 6 there is a sliding block code $\varphi = (\varphi_n)_n$ such that
$$\mu\{\pi_0 \ne \varphi_0\} < \varepsilon.$$
By Step 1, we know that
$$h(\varphi_*\mu) \le h(\mu).$$
On the other hand, the joint distribution
$$\lambda = (\pi, \varphi)_*\mu \in \mathrm{Prob}(B^{\mathbb{Z}} \times B^{\mathbb{Z}})$$
is a joining of $\nu$ and $\varphi_*\mu$ which satisfies
$$\lambda\{(y, y') : y_0 \ne y_0'\} = \mu\{\pi_0 \ne \varphi_0\} < \varepsilon.$$
Therefore Lemma 7 gives
$$h(\nu) \le h(\varphi_*\mu) + H(\varepsilon, 1-\varepsilon) + \varepsilon\log(|B| - 1) \le h(\mu) + H(\varepsilon, 1-\varepsilon) + \varepsilon\log(|B| - 1).$$
The second and third terms on the right can be made arbitrarily small by choosing $\varepsilon$ sufficiently small, so this completes the proof.

Corollary 8 (Kolmogorov–Sinai theorem). Isomorphic shift-systems have the same entropy rate.

Finally, let us extend the definition of entropy rate to arbitrary MPSs. First, recall that if $(X, \mu, T)$ is an MPS, $A$ is a finite alphabet, and $\varphi_0 : X \longrightarrow A$ is measurable, then we obtain from these a stationary $A$-valued process
$$\varphi = (\varphi_0 \circ T^n)_{n \in \mathbb{Z}}.$$
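Lemma 7 can also be sanity-checked numerically. For an i.i.d. Bernoulli($q$) source joined to its output through a channel that flips each symbol independently with probability $p$, both entropy rates and the disagreement probability are explicit, so the bound can be tested directly. This is a toy check of ours, not part of the notes; the alphabet is binary, so the $\log(|A|-1)$ term vanishes and the bound reads $|h(\mu) - h(\nu)| \le H(p, 1-p)$.

```python
from math import log2

def H2(t):
    """Binary entropy in bits, H(t, 1-t)."""
    return 0.0 if t in (0.0, 1.0) else -t * log2(t) - (1 - t) * log2(1 - t)

# Input: Bernoulli(q) i.i.d., so h(mu) = H2(q). Flipping each symbol
# independently with probability p gives an i.i.d. Bernoulli(q') output
# with q' = q(1-p) + (1-q)p, so h(nu) = H2(q'), and the joining has
# lambda{x_0 != y_0} = p exactly.
for q in [0.05, 0.1, 0.3, 0.5]:
    for p in [0.01, 0.05, 0.2]:
        q2 = q * (1 - p) + (1 - q) * p
        assert abs(H2(q) - H2(q2)) <= H2(p) + 1e-12
```

The bound is far from tight here, but it is uniform over all joinings, which is exactly what Step 2 above needs.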
Definition 9. The entropy rate of $(X, \mu, T)$ and $\varphi$ is
$$h(\mu, T, \varphi) := h(\varphi_*\mu).$$
The Kolmogorov–Sinai or KS entropy of $(X, \mu, T)$ is
$$h(\mu, T) := \sup\big\{h(\mu, T, \varphi) : |A| < \infty \text{ and } \varphi_0 : X \longrightarrow A \text{ measurable}\big\}.$$

We could have made this definition much sooner. It is easily checked to be invariant under isomorphism. But the power of this definition is much clearer now that we have Theorem 5. If $\varphi$ and $\psi$ are two finite-valued processes on the same underlying MPS $(X, \mu, T)$, and if $\varphi$ is an isomorphism of MPSs, then that theorem gives
$$h(\mu, T, \psi) \le h(\mu, T, \varphi).$$
Therefore we may compute $h(\mu, T)$ as $h(\varphi_*\mu)$ whenever we can find a process $\varphi$ on $(X, \mu, T)$ which is an isomorphism to a shift-system. In that case the time-zero observable $\varphi_0$ is said to be generating.

Because of this, our previous calculations of entropy rates for processes all give the KS entropies of the underlying systems:

- A Bernoulli shift $[A^{\mathbb{Z}}, p^{\times\mathbb{Z}}]$ has KS entropy equal to $H(p)$.
- If $\mu \in \mathrm{Prob}(A^{\mathbb{Z}})$ is the joint distribution of a stationary Markov chain with transition kernel $\theta$ and equilibrium distribution $\pi$, then its KS entropy is
$$\sum_{a \in A} \pi(a)\, H\big(\theta(\,\cdot \mid a)\big).$$
- A circle rotation has entropy zero. Indeed, for an irrational circle rotation, we previously saw that any non-trivial arc of $\mathbb{T}$ defines a generating observable of entropy rate zero. The case of a rational rotation is even simpler, since then any process it generates is periodic.

The KS entropy is one of the most important invariants of an MPS. Its first major application, and Kolmogorov and Sinai's original motivation, was the following.

Corollary 10. Two Bernoulli shifts $[A^{\mathbb{Z}}, p^{\times\mathbb{Z}}]$ and $[B^{\mathbb{Z}}, q^{\times\mathbb{Z}}]$ can be isomorphic only if $H(p) = H(q)$.

Before the introduction of entropy, it was open whether in fact all Bernoulli shifts are isomorphic. It turns out that the condition $H(p) = H(q)$ is also sufficient for isomorphism. This much harder result is Ornstein's theorem. We return to it later in the course.
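The Markov formula in the list above is easy to evaluate in practice: find the equilibrium distribution as a left eigenvector of the transition matrix, then average the row entropies. A minimal sketch (the two-state transition matrix below is just a made-up example):

```python
import numpy as np

# KS entropy of a stationary Markov shift:
#   h = sum_a pi(a) * H(theta(.|a)),  pi the equilibrium distribution.
theta = np.array([[0.9, 0.1],
                  [0.4, 0.6]])

# Equilibrium distribution: left eigenvector of theta for eigenvalue 1,
# normalized to sum to 1.
vals, vecs = np.linalg.eig(theta.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi /= pi.sum()

def H(p):
    """Shannon entropy (in bits) of a probability vector."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

h_KS = sum(pi[a] * H(theta[a]) for a in range(len(pi)))

# Sanity check: a deterministic (permutation) transition matrix has
# entropy zero, matching the periodic-process discussion above.
perm = np.eye(2)[[1, 0]]
assert abs(sum(0.5 * H(perm[a]) for a in range(2))) < 1e-12
```

For a Bernoulli shift all rows of $\theta$ equal $p$, and the formula collapses to $H(p)$, as in the first bullet point.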
3 Notes and remarks

Joinings were introduced into ergodic theory in Furstenberg's classic paper [Fur67]. It remains highly influential, and is well worth reading. The survey [Šuj83] is a good place to start learning more about the study of general sources in information theory.

References

[Fur67] Harry Furstenberg. Disjointness in ergodic theory, minimal sets, and a problem in Diophantine approximation. Math. Systems Theory, 1:1-49, 1967.

[Šuj83] Štefan Šujan. Ergodic theory, entropy, and coding problems of information theory. Kybernetika (Prague) Suppl., 19(1-4):67, 1983.

TIM AUSTIN
tim@math.ucla.edu
URL: math.ucla.edu/~tim
More informationLebesgue Measure. Dung Le 1
Lebesgue Measure Dung Le 1 1 Introduction How do we measure the size of a set in IR? Let s start with the simplest ones: intervals. Obviously, the natural candidate for a measure of an interval is its
More informationLECTURE 16: UNITARY REPRESENTATIONS LECTURE BY SHEELA DEVADAS STANFORD NUMBER THEORY LEARNING SEMINAR FEBRUARY 14, 2018 NOTES BY DAN DORE
LECTURE 16: UNITARY REPRESENTATIONS LECTURE BY SHEELA DEVADAS STANFORD NUMBER THEORY LEARNING SEMINAR FEBRUARY 14, 2018 NOTES BY DAN DORE Let F be a local field with valuation ring O F, and G F the group
More information215 Problem 1. (a) Define the total variation distance µ ν tv for probability distributions µ, ν on a finite set S. Show that
15 Problem 1. (a) Define the total variation distance µ ν tv for probability distributions µ, ν on a finite set S. Show that µ ν tv = (1/) x S µ(x) ν(x) = x S(µ(x) ν(x)) + where a + = max(a, 0). Show that
More informationarxiv: v1 [math.gr] 1 Apr 2019
ON A GENERALIZATION OF THE HOWE-MOORE PROPERTY ANTOINE PINOCHET LOBOS arxiv:1904.00953v1 [math.gr] 1 Apr 2019 Abstract. WedefineaHowe-Moore propertyrelativetoasetofsubgroups. Namely, agroupg has the Howe-Moore
More informationDYNAMICAL CUBES AND A CRITERIA FOR SYSTEMS HAVING PRODUCT EXTENSIONS
DYNAMICAL CUBES AND A CRITERIA FOR SYSTEMS HAVING PRODUCT EXTENSIONS SEBASTIÁN DONOSO AND WENBO SUN Abstract. For minimal Z 2 -topological dynamical systems, we introduce a cube structure and a variation
More informationMATH41011/MATH61011: FOURIER SERIES AND LEBESGUE INTEGRATION. Extra Reading Material for Level 4 and Level 6
MATH41011/MATH61011: FOURIER SERIES AND LEBESGUE INTEGRATION Extra Reading Material for Level 4 and Level 6 Part A: Construction of Lebesgue Measure The first part the extra material consists of the construction
More informationStation keeping problem
Station keeping problem Eric Goubault Benjamin Martin Sylvie Putot 8 June 016 Benjamin Martin Station keeping problem 8 June 016 1 / 18 Introduction Hybrid autonomous systems Consider the system: 9x f
More informationJOININGS, FACTORS, AND BAIRE CATEGORY
JOININGS, FACTORS, AND BAIRE CATEGORY Abstract. We discuss the Burton-Rothstein approach to Ornstein theory. 1. Weak convergence Let (X, B) be a metric space and B be the Borel sigma-algebra generated
More informationSolutions to Homework Set #1 Sanov s Theorem, Rate distortion
st Semester 00/ Solutions to Homework Set # Sanov s Theorem, Rate distortion. Sanov s theorem: Prove the simple version of Sanov s theorem for the binary random variables, i.e., let X,X,...,X n be a sequence
More information4. Ergodicity and mixing
4. Ergodicity and mixing 4. Introduction In the previous lecture we defined what is meant by an invariant measure. In this lecture, we define what is meant by an ergodic measure. The primary motivation
More informationDiscrete Random Variables (cont.) Discrete Distributions the Geometric pmf
Discrete Random Variables (cont.) ECE 313 Probability with Engineering Applications Lecture 10 - September 29, 1999 Professor Ravi K. Iyer University of Illinois the Geometric pmf Consider a sequence of
More informationarxiv: v3 [math.ds] 28 Oct 2015
The Chowla and the Sarnak conjectures from ergodic theory point of view (extended version) H. El Abdalaoui J. Ku laga-przymus M. Lemańczyk T. de la Rue arxiv:40.673v3 [math.ds] 28 Oct 205 October 29, 205
More informationMidterm Exam Information Theory Fall Midterm Exam. Time: 09:10 12:10 11/23, 2016
Midterm Exam Time: 09:10 12:10 11/23, 2016 Name: Student ID: Policy: (Read before You Start to Work) The exam is closed book. However, you are allowed to bring TWO A4-size cheat sheet (single-sheet, two-sided).
More information2.2 Some Consequences of the Completeness Axiom
60 CHAPTER 2. IMPORTANT PROPERTIES OF R 2.2 Some Consequences of the Completeness Axiom In this section, we use the fact that R is complete to establish some important results. First, we will prove that
More informationMath 4317 : Real Analysis I Mid-Term Exam 1 25 September 2012
Instructions: Answer all of the problems. Math 4317 : Real Analysis I Mid-Term Exam 1 25 September 2012 Definitions (2 points each) 1. State the definition of a metric space. A metric space (X, d) is set
More informationConsistency of the maximum likelihood estimator for general hidden Markov models
Consistency of the maximum likelihood estimator for general hidden Markov models Jimmy Olsson Centre for Mathematical Sciences Lund University Nordstat 2012 Umeå, Sweden Collaborators Hidden Markov models
More informationarxiv: v1 [math.ca] 4 Apr 2017
ON LOCALIZATION OF SCHRÖDINGER MEANS PER SJÖLIN Abstract. Localization properties for Schrödinger means are studied in dimension higher than one. arxiv:704.00927v [math.ca] 4 Apr 207. Introduction Let
More information