Kolmogorov Complexity and Diophantine Approximation

Jan Reimann
Institut für Informatik, Universität Heidelberg

Algorithmic Information Theory

Algorithmic Information Theory classifies objects by their descriptive complexity.

Examples:
- A sequence of 10000 zeroes has low complexity, since there is a much shorter description.
- Any (infinite) sequence computable by an algorithm, such as π, has a short description (the algorithm).
- The outcome of a sequence of (unbiased) coin tosses has high complexity: there is no description shorter than the sequence itself.

Rigorous formulation: Kolmogorov complexity.

Kolmogorov Complexity

Let U be a universal Turing machine.

Definition: For a binary string σ ∈ {0,1}*,

    C(σ) = C_U(σ) = min{ |p| : p ∈ {0,1}*, U(p) = σ },

i.e. C(σ) is the length of the shortest program (for U) that outputs σ.

Kolmogorov's invariance theorem: C is independent of U (up to an additive constant).

The pigeonhole principle yields that for every length there are incompressible strings, i.e. strings with C(σ) ≥ |σ|; in fact, most strings of length n have complexity close to n.
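
C itself is not computable, but any lossless compressor gives an upper bound on descriptive complexity, up to the additive constant for a fixed decompressor. A minimal Python sketch of this idea, using zlib purely as an illustrative stand-in for a universal machine (the function name is mine):

```python
import os
import zlib

def description_length_bits(s: bytes) -> int:
    """Bits zlib needs to describe s -- an upper bound on C(s),
    up to the additive constant for a fixed zlib decompressor."""
    return 8 * len(zlib.compress(s, 9))

print(description_length_bits(b"0" * 10000))       # low: a few hundred bits
print(description_length_bits(os.urandom(10000)))  # high: close to 80000 bits
```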

Digression: Euclid Revisited

Suppose there are only finitely many primes p_1, ..., p_n.

Then each number N can be factored as N = p_1^{k_1} ⋯ p_n^{k_n}, with each k_i ≤ log N.

Therefore N can be described by a program of length O(n log log N) (identifying natural numbers with their binary representations).

This yields a contradiction if N is incompressible, i.e. C(N) ≥ log N.
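
Making the contradiction explicit (a sketch; all constants are absorbed into the O-terms):

```latex
% Pick N with C(N) \ge \log N - O(1) (an incompressible number).
% Under the assumed factorization, N is determined by the exponent vector (k_1,\dots,k_n):
\log N - O(1) \;\le\; C(N) \;\le\; n\bigl(\log\log N + O(1)\bigr) + O(1),
% which fails for all sufficiently large such N, since n is fixed.
```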

Kolmogorov Complexity and Coding

A prefix-free Turing machine is a TM with prefix-free domain. The prefix-free version of C (using only prefix-free TMs) is denoted by K.

Kraft-Chaitin Theorem: Let {σ_i}_{i∈N} be a set of strings and {l_1, l_2, ...} a sequence of natural numbers ("lengths") such that

    Σ_{i∈N} 2^{-l_i} ≤ 1.

Then one can construct (primitive recursively) a prefix-free TM M and strings {τ_i}_{i∈N} such that |τ_i| = l_i and M(τ_i) = σ_i.
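
The online Kraft-Chaitin construction handles length requests as they are enumerated; the following offline Python sketch only illustrates why the Kraft inequality suffices to find prefix-free codewords of the requested lengths (the sorting step and names are my own simplification, not the theorem's construction):

```python
from fractions import Fraction

def kraft_codewords(lengths):
    """Given lengths l_i with sum(2^-l_i) <= 1, return prefix-free binary
    codewords tau_i with |tau_i| = l_i, by handing out consecutive dyadic
    intervals [k*2^-l, (k+1)*2^-l) in order of increasing length."""
    assert sum(Fraction(1, 2**l) for l in lengths) <= 1, "Kraft inequality violated"
    codes = [None] * len(lengths)
    left = Fraction(0)                        # left endpoint of the next free interval
    for i in sorted(range(len(lengths)), key=lambda j: lengths[j]):
        l = lengths[i]
        k = int(left * 2**l)                  # an integer, since shorter lengths came first
        codes[i] = format(k, f"0{l}b")        # binary name of the dyadic interval
        left += Fraction(1, 2**l)
    return codes

print(kraft_codewords([1, 2, 3, 3]))          # ['0', '10', '110', '111']
```

Disjoint dyadic intervals correspond exactly to a prefix-free set of strings, which is why the interval bookkeeping works.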

Kolmogorov Complexity and Coding

Semimeasures: functions m : {0,1}* → [0,∞) with Σ_{σ∈{0,1}*} m(σ) ≤ 1.

There exists a maximal enumerable semimeasure m: it is enumerable from below, and for every enumerable semimeasure m' there is a constant c_{m'} such that m' ≤ c_{m'} · m.

Coding Theorem (Zvonkin-Levin): K(σ) = −log m(σ) + O(1).

Identify randomness with incompressibility: say an infinite binary sequence ξ is random if there is a constant c such that for all n, K(ξ↾n) ≥ n − c.

Diophantine Approximation

Diophantine Approximation classifies real numbers by how well they may be approximated by rational numbers.

Fundamental theorem of Dirichlet: For any irrational α there exist infinitely many p/q (relatively prime) such that

    |α − p/q| < 1/q².

Such a sequence of rationals is obtained from the continued fraction expansion

    α = [a_0, a_1, a_2, ...] = a_0 + 1/(a_1 + 1/(a_2 + 1/(a_3 + ...))),   a_i ∈ N.

The expansion is finite if and only if α is rational.
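
A short Python sketch computing partial quotients and convergents for a real given as a high-precision rational (here a rational approximation of √2, itself one of its convergents), and checking Dirichlet's bound along the way; the function names are mine:

```python
from fractions import Fraction

def continued_fraction(x: Fraction, n_terms: int):
    """First n_terms partial quotients [a_0, a_1, ...] of x."""
    quotients = []
    for _ in range(n_terms):
        a = x.numerator // x.denominator
        quotients.append(a)
        x -= a
        if x == 0:
            break
        x = 1 / x
    return quotients

def convergents(quotients):
    """Convergents p_n/q_n via p_n = a_n p_{n-1} + p_{n-2}, q_n likewise."""
    p_prev, q_prev, p, q = 1, 0, quotients[0], 1
    result = [Fraction(p, q)]
    for a in quotients[1:]:
        p_prev, q_prev, p, q = p, q, a * p + p_prev, a * q + q_prev
        result.append(Fraction(p, q))
    return result

alpha = Fraction(665857, 470832)              # close rational approximation of sqrt(2)
cf = continued_fraction(alpha, 10)
print(cf)                                     # [1, 2, 2, 2, ...]
for pq in convergents(cf):
    assert abs(alpha - pq) < Fraction(1, pq.denominator**2)   # Dirichlet's bound
```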

Diophantine Approximation

In general, one cannot improve the exponent 2 in Dirichlet's theorem.

A number β is badly approximable if there exists a constant K > 0 such that for all p/q ∈ Q, |β − p/q| ≥ K/q².

Examples of badly approximable numbers:
- The golden mean (1 + √5)/2 = [1,1,1,1,...]; any 0 < K < 1/√5 works.
- √2 = [1,2,2,2,...], and in fact all irrational square roots.
- e mod 1 = [1,2,1,1,4,1,1,6,...] is not badly approximable.
- π mod 1 = [7,15,1,292,1,1,...]: unknown.

Diophantine Approximation

Algebraic numbers are close to badly approximable.

Roth's Theorem: For any algebraic irrational α and any ε > 0, the inequality

    |α − p/q| ≤ 1/q^{2+ε}

has only finitely many solutions p/q.

However, there are numbers which are very well approximable. A Liouville number is an irrational α such that for every n there is a rational p/q with

    |α − p/q| ≤ 1/q^n.

Example: Σ_n 10^{−n!}.

Metric Diophantine Approximation

What is the situation typically? Are most numbers rather well or rather badly approximable?

For almost all numbers, the exponent 2 cannot be improved much.

Khintchine's Theorem: Let ψ : N → R⁺ be such that lim_n ψ(n) = 0, and define

    W_ψ = {α : |α − p/q| < ψ(q) for infinitely many p/q}.

Then, for Lebesgue measure λ,

    λ(W_ψ) = 0 if Σ_k k·ψ(k) < ∞,   λ(W_ψ) = 1 if Σ_k k·ψ(k) = ∞.

So well-approximable numbers are rather rare. Can we tell how rare?
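
A worked instance of the dichotomy (a standard example, not on the slides): improving Dirichlet's 1/q² by a factor (log q)² already misses almost every number, while a factor log q still catches almost every number,

```latex
\psi(q)=\frac{1}{q^{2}(\log q)^{2}}:\quad
  \sum_k k\,\psi(k)=\sum_k\frac{1}{k(\log k)^{2}}<\infty
  \;\Longrightarrow\;\lambda(W_\psi)=0,
\qquad
\psi(q)=\frac{1}{q^{2}\log q}:\quad
  \sum_k k\,\psi(k)=\sum_k\frac{1}{k\log k}=\infty
  \;\Longrightarrow\;\lambda(W_\psi)=1.
```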

Hausdorff Measures

Caratheodory-Hausdorff construction on metric spaces: let A ⊆ X, where X is a separable metric space, let h : R → R be a monotone increasing function, continuous on the right, with h(0) = 0, and let δ > 0.

Define the set function

    H^h_δ(A) = inf { Σ_i h(diam U_i) : A ⊆ ∪_i U_i, diam U_i ≤ δ }.

Letting δ → 0 yields an (outer) measure: the h-dimensional Hausdorff measure H^h is defined as

    H^h(A) = lim_{δ→0} H^h_δ(A).

Properties of Hausdorff Measures

H^h is Borel regular: all Borel sets are measurable, and for every Y ⊆ X there is a Borel set B ⊇ Y such that H^h(B) = H^h(Y).

An obvious choice for h is h(x) = x^s for some s ≥ 0. For such h, denote the corresponding Hausdorff measure by H^s.

For s = 1, H^1 is the usual Lebesgue measure λ on 2^N.

For 0 ≤ s < t < ∞ and Y ⊆ X: H^s(Y) < ∞ implies H^t(Y) = 0, and H^t(Y) > 0 implies H^s(Y) = ∞.

Hausdorff Dimension

[Graph omitted: H^s(A) as a function of s jumps from ∞ to 0 at a critical value of s.]

The Hausdorff dimension of A is

    dim_H(A) = inf{s ≥ 0 : H^s(A) = 0} = sup{t ≥ 0 : H^t(A) = ∞}.
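
A standard worked example (not on the slides) with h(x) = x^s: cover the middle-thirds Cantor set C by the 2^n construction intervals of generation n, each of diameter 3^{−n}; then

```latex
\mathcal{H}^{s}_{3^{-n}}(C)\ \le\ 2^{n}\,\bigl(3^{-n}\bigr)^{s}\ =\ \bigl(2\cdot 3^{-s}\bigr)^{n},
```

which stays bounded as n → ∞ exactly when s ≥ log 2 / log 3; together with a matching lower bound this gives dim_H(C) = log 2 / log 3.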

Hausdorff Dimension and Diophantine Approximation

Let W_δ = W_ψ for ψ(q) = 1/q^δ. So the Liouville numbers are just the numbers contained in every W_n.

Jarnik-Besicovitch Theorem: For δ ≥ 2, dim_H W_δ = 2/δ.

In particular, the set of Liouville numbers has Hausdorff dimension zero.

Random Continued Fractions

Other questions: What does a typical continued fraction look like? Do the components reflect some approximation properties?

This is also interesting from a different point of view: what is the complexity of a function from natural numbers to natural numbers?

In the following, identify an initial segment [a_1, ..., a_n] of a continued fraction with its n-th convergent

    p_n/q_n = a_1 + 1/(a_2 + 1/(a_3 + ... + 1/a_n)).

Random Continued Fractions

Using the ergodic theorem one can show, for instance, that

    (1/n) log q_n → π²/(12 log 2)

for almost every α. The constant arises from the entropy of the Gauss map x ↦ 1/x mod 1.
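
A quick empirical illustration, under the heuristic assumption that a pseudo-random dyadic rational behaves like a typical real for its first few hundred partial quotients (a sketch, not a proof; all names are mine):

```python
import math
import random
from fractions import Fraction

def partial_quotients(x: Fraction, n_terms: int):
    """First n_terms partial quotients [a_0, a_1, ...] of x."""
    out = []
    for _ in range(n_terms):
        a = x.numerator // x.denominator
        out.append(a)
        x -= a
        if x == 0:
            break
        x = 1 / x
    return out

random.seed(0)
bits = 4000
alpha = Fraction(random.getrandbits(bits), 2**bits)   # pseudo-random real to 4000 bits

pq = partial_quotients(alpha, 300)
q_prev, q = 0, 1                        # q_{-1} = 0, q_0 = 1
for a in pq[1:]:                        # q_n depends only on a_1, ..., a_n
    q_prev, q = q, a * q + q_prev

print(math.log(q) / (len(pq) - 1))      # empirically close to ...
print(math.pi**2 / (12 * math.log(2)))  # ... Levy's constant, about 1.1866
```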

Measure on Baire Space

Basic open cylinders: the sets [a_0, ..., a_n] of all continued fractions extending a given initial segment.

Viewed as subsets of [0,1], their diameter can be determined explicitly:

    diam [a_0, ..., a_n] = 1 / (q_n (q_n + q_{n−1})).

This gives rise to a Borel measure in the usual way (extension from an algebra to the σ-algebra); equivalently, one can work with Hausdorff measures.
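
The diameter formula is a short computation from the standard convergent identities: the reals whose expansion starts with a_0, ..., a_n form the interval with endpoints p_n/q_n and (p_n + p_{n−1})/(q_n + q_{n−1}), so

```latex
\operatorname{diam}[a_0,\dots,a_n]
  \;=\;\left|\frac{p_n}{q_n}-\frac{p_n+p_{n-1}}{q_n+q_{n-1}}\right|
  \;=\;\frac{|p_n q_{n-1}-p_{n-1} q_n|}{q_n(q_n+q_{n-1})}
  \;=\;\frac{1}{q_n(q_n+q_{n-1})},
```

using the identity p_n q_{n−1} − p_{n−1} q_n = ±1.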

Effective (Hausdorff) Measure

Introduce effective coverings and define the notion of effective H^s-measure 0.

Let X = 2^N or X = N^N, and let s ≥ 0 be rational. A set A ⊆ X is Σ⁰₁-H^s null, written Σ⁰₁-H^s(A) = 0, if there is a recursive sequence (C_n) of enumerable sets of strings such that for each n,

    A ⊆ ∪_{w ∈ C_n} [w]   and   Σ_{w ∈ C_n} (diam [w])^s < 2^{−n}.

For s = 1 one obtains an effective version of Lebesgue measure λ.

Theorem (Schnorr): A sequence ξ ∈ 2^N is random if and only if {ξ} is not Σ⁰₁-λ null.

Random Continued Fractions

Does a similar characterization hold for continued fractions?

Theorem (Reimann, Gacs): A continued fraction α = [a_1, a_2, ...] is not Σ⁰₁-λ-null if and only if

    sup_n ( −log λ[a_1, ..., a_n] − K(⟨a_1, ..., a_n⟩) ) < ∞.

Random Continued Fractions

Is the real represented by a random binary sequence (via its dyadic expansion) also a random continued fraction (in terms of Σ⁰₁-measure)?

Problem: the continued fraction expansion might code things more efficiently.

Random Continued Fractions

Surprisingly, it does not.

Theorem (Lochs): For ξ ∈ 2^N, denote by π_n(ξ) the number of partial quotients a_i of the continued fraction expansion of ξ determined by the first n binary digits of ξ. Then, for almost every ξ,

    lim_n π_n(ξ)/n = 6 (log 2)² / π².

Lochs' theorem holds effectively, that is, for every random ξ. The proof requires some effort to avoid the use of the ergodic theorem (which is not an effective law of probability).
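
An empirical check of the constant, reading π_n(ξ) as the number of partial quotients shared by all reals in the dyadic interval given by the first n bits (boundary effects shift this count by at most O(1), which does not affect the limit); a heuristic sketch with names of my own:

```python
import math
import random
from fractions import Fraction

def cf(x: Fraction):
    """Full (finite) continued fraction expansion of a rational x."""
    out = []
    while True:
        a = x.numerator // x.denominator
        out.append(a)
        x -= a
        if x == 0:
            return out
        x = 1 / x

def common_prefix(u, v):
    k = 0
    while k < min(len(u), len(v)) and u[k] == v[k]:
        k += 1
    return k

random.seed(1)
n = 3000
k = random.getrandbits(n)                        # first n binary digits of a pseudo-random xi
left, right = Fraction(k, 2**n), Fraction(k + 1, 2**n)
pi_n = common_prefix(cf(left), cf(right)) - 1    # shared partial quotients, dropping a_0 = 0

print(pi_n / n)                                  # empirically near ...
print(6 * math.log(2)**2 / math.pi**2)           # ... Lochs' constant for binary digits, about 0.292
```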

Random Continued Fractions

Every random continued fraction satisfies (1/n) log q_n → π²/(12 log 2).

The proof also yields that a random continued fraction α = [a_1, a_2, ...] must have arbitrarily large partial quotients a_i.

Theorem: An irrational α is badly approximable if and only if its continued fraction expansion is bounded.

Hence, from our point of view, badly approximable numbers must be compressible.

Effective Hausdorff Dimension

Based on effective measure, we can also define effective Hausdorff dimension:

    dim₁ A = inf{s ≥ 0 : Σ⁰₁-H^s(A) = 0}.

This yields a notion of Hausdorff dimension for individual sequences. It turns out that the effective dimension of a set is determined by the dimension of its most complex members:

    dim₁ A = sup_{ξ ∈ A} dim₁ ξ.

Theorem: For any sequence ξ ∈ 2^N,

    dim₁ ξ = liminf_n K(ξ↾n) / n.
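
For instance (a standard illustration, not taken from the slides): interleaving a random sequence x with zeroes dilutes its complexity by a factor of two,

```latex
\xi = x_1\,0\,x_2\,0\,x_3\,0\cdots
\quad\Longrightarrow\quad
K(\xi\!\upharpoonright\! n) = \tfrac{n}{2} + O(\log n)
\quad\Longrightarrow\quad
\dim_1 \xi \;=\; \liminf_n \frac{K(\xi\!\upharpoonright\! n)}{n} \;=\; \tfrac{1}{2}.
```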

Badly Approximable Numbers

Badly approximable numbers are compressible.

Jarnik's Theorem: Let E_n = {α = [a_0, a_1, ...] ∈ [0,1] : a_i ≤ n for all i}. Then, for n ≥ 8,

    1 − 4/(n log 2) ≤ dim_H E_n ≤ 1 − 1/(8 n log n).

Later, Bumby and Hensley gave good approximations of dim_H E_n for small values of n, e.g. dim_H E_2 = 0.5312805...

Using the complexity-theoretic characterization of dimension, we can give a simpler, essentially combinatorial proof.