Effective randomness and computability


1 University of Wisconsin October 2009

2 What is this about? Let's begin by examining the title: Effective randomness (from an algorithmic point of view); Computability (the study of the limits of algorithms).

3 Algorithms Etymology: al-Khwārizmī, Persian astronomer and mathematician, who wrote a treatise in 825 AD, "On Calculation with Hindu Numerals". Its Latin translation is "Algoritmi de numero Indorum". There is no generally accepted formal definition of "algorithm". What we intuitively mean is that there is a mechanical procedure (devoid of intelligence) which gives the desired result after a finite number of steps. Notice the word "finite". I will try to give an overview of the subject, and talk about some of my own work.

8 Algorithms In these cases you specify an input, or set of ingredients, and the algorithm applies a mechanical method to get the desired result. Euclid's algorithm for finding the greatest common divisor: Input: a pair of numbers (1001, 357). 1001 = 2 · 357 + 287; 357 = 1 · 287 + 70; 287 = 4 · 70 + 7; 70 = 10 · 7 + 0. Output: gcd(1001, 357) = 7.

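The mechanical character of Euclid's algorithm is easy to see in code. A minimal sketch (in Python, purely for illustration), which also records each division step of the (1001, 357) example above:

```python
def euclid(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) by (b, a mod b)."""
    steps = []
    while b != 0:
        q, r = divmod(a, b)
        steps.append((a, q, b, r))  # records the step a = q*b + r
        a, b = b, r
    return a, steps

g, steps = euclid(1001, 357)
# g == 7; steps records 1001 = 2*357 + 287, 357 = 1*287 + 70, ...
```

Each step is forced by the previous one, with no intelligence required: exactly the "mechanical procedure" of the intuitive definition.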
10 Into the 20th century David Hilbert had a grand plan to finitely "mechanize" all of mathematics, based on the idea that in mathematics there should be no "ignorabimus" (a statement whose truth can never be known): a machine which you can feed Input: a statement about mathematics. Process: the machine uses a reasonable formal system to generate "proofs". Output: True or False. Equivalently, can you have a mechanical procedure that "enumerates" all the truths in a system (e.g. number theory)?

13 Into the 20th century Gödel proved his two famous Incompleteness Theorems. First Incompleteness Theorem: any sufficiently strong formal system of axioms has a statement P for which neither P nor ¬P can be proven. Furthermore, if you add P to the system, there will still be another statement P′ independent of the augmented system. Second Incompleteness Theorem: any such system of axioms cannot prove the statement "I am consistent", unless it is itself inconsistent. The collective intuition of generations of mathematicians was wrong.

16 Independence Some systems are decidable: for example, real closed fields and Euclidean geometry (Alfred Tarski, via quantifier elimination). Gödel's example was artificial. Are there statements which matter in working mathematics? Yes... from Peano arithmetic (PA).

17 Independence Classically, Kruskal's Tree Theorem states that the set of finite trees under homeomorphic embedding is a well-quasi-ordering (in particular, there is no infinite antichain). Friedman noted that a special case of this is independent of PA: for all n there is a k so large that if T_1, ..., T_k are finite trees with |T_i| ≤ n + i, then there is a pair i < j where T_i embeds into T_j. We can state this in PA, but you need very strong induction (beyond PA) to prove it.

18 Independence Another example is the "Goodstein sequence", shown by Kirby and Paris to be independent of PA. This was the analogy given by Kirby and Paris: the "Hydra" is a rooted tree, and a move consists of cutting off one of its "heads" (a branch of the tree), to which the hydra responds by growing a finite number of new heads according to certain rules. The theorem says that the Hydra will eventually be killed, regardless of the strategy Hercules uses to chop off its heads, though this may take a very, very long time.

19 Independence Another example is the "Goodstein sequence", shown by Kirby and Paris to be independent of PA. Start with a number, say 4, and write it in base 2: 4 = 2^2. Replace the base 2 with 3 and subtract 1: 3^3 − 1 = 26. Write 26 in base 3 (2·3^2 + 2·3 + 2), replace base 3 with 4 and subtract 1: 2·4^2 + 2·4 + 2 − 1 = 41. Amazingly, every such sequence converges to 0!

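The base-bumping step can be made precise in code. A small illustrative sketch in Python: `bump` rewrites n in hereditary base-b notation (exponents themselves written in base b, recursively) and replaces every occurrence of b by b + 1.

```python
def bump(n, b):
    """Rewrite n in hereditary base-b notation, replacing base b by b + 1.
    Exponents are themselves rewritten recursively."""
    result, power = 0, 0
    while n > 0:
        n, d = divmod(n, b)
        if d:
            result += d * (b + 1) ** bump(power, b)
        power += 1
    return result

def goodstein(start, terms):
    """The first `terms` values of the Goodstein sequence from `start`."""
    seq, g, b = [start], start, 2
    while len(seq) < terms and g > 0:
        g = bump(g, b) - 1  # bump the base, then subtract 1
        b += 1
        seq.append(g)
    return seq

# goodstein(4, 5) gives [4, 26, 41, 60, 83], matching the slide's example
```

Despite the explosive growth of the base, PA cannot prove that this always reaches 0; the standard proof assigns a strictly decreasing ordinal below ε₀ to each term.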
23 Independence Kirby and Paris showed this can neither be proved nor refuted in PA; one needs transfinite induction. During this time, various people were working on formalizing what we mean by "algorithm" and "mechanical method"... There were several notable models.

24 Formalizing computations Stephen C. Kleene: µ-recursive functions. Alonzo Church: λ-calculus.

25 Formalizing computations Alan Turing: Turing machines. [Pictured: a Turing machine]

27 Formalizing computations The fact that these models were all equivalent lent support to the Church-Turing thesis: "All mechanical and intuitively computable processes can be simulated on a Turing machine." Turing worked on code breaking during WW2 (the Enigma machine). His fundamental paper was part of the inspiration for the first computers.

28 Formalizing computations [Images: an Enigma machine; a Turing machine made of Lego]

29 Formalizing computations The Small-Scale Experimental Machine, known as "Baby" (University of Manchester, June 21st, 1948), was the first machine that could electronically store and run a program.

30 Formalizing computations Not every set of natural numbers is computable; the most notable example is the Halting problem. Input: a pair of numbers e, x. Output: whether the e-th TM halts on input x. This is not computable. By coding this set into others, we get other non-computable sets.

31 Formalizing computations Hilbert's tenth problem. Input: a polynomial p(x_1, ..., x_n) with integer coefficients. Output: whether p(x_1, ..., x_n) = 0 has integer solutions. There is no computable process to decide such problems (Matiyasevich, after Julia Robinson). Recently Braverman and Yampolsky showed that Julia sets can be non-computable, by coding the Halting problem. Many mathematical objects can be coded and used to simulate computations.

33 Measuring unsolvability We want to measure how impossible it is to compute a set, relative to other sets. Given two sets A, B ⊆ N, we say A ≤_T B if, whenever we are given a way to solve B, we have a way of solving A. The equivalence classes are called Turing degrees. A degree is computably enumerable (c.e.) if it contains the halting set of some machine.

34 Measuring unsolvability Since there are only countably many ways of measuring relative unsolvability, each degree is countable, and so there are continuum many Turing degrees. Structurally, the Turing degrees form an upper semilattice with minimal elements (Spector, Sacks). The c.e. Turing degrees were at some point very well studied; a motivation for looking at c.e. degrees is their relation to the decidability of theories and formal systems.

35 C.e. degrees (Post's problem) Is there a c.e. Turing degree a which is not computable, yet does not compute the Halting problem? Friedberg and Muchnik (1956) developed an important new technique, the priority method. The structure of the c.e. degrees is dense (Sacks 1962), and in fact much more complicated than originally thought. Recently Downey, Hirschfeldt, Nies and Stephan proved a priority-free, natural solution to Post's problem, from effective randomness.

36 Feebleness If you relativize the construction of the Halting problem to a set X, you get the jump operator taking X to X′. In an influential paper, Kleene and Post asked about the range of this operator. Work of Friedberg, Shoenfield and Sacks collectively showed the range is the largest possible (even on restricted domains). A fundamental operator.

37 Feebleness The jump operator gives a way of measuring the computational feebleness of a set. A set A is low if A′ ≤_T ∅′: useless as an oracle. Recent work of Downey, Hirschfeldt and Nies has shown nice relationships between low sets and effective randomness.

38 What makes a string random? A real is a member of Cantor space 2^N, with the topology generated by the basic clopen sets [σ] = {σα : α ∈ 2^ω}, where the measure is µ([σ]) = 2^(−|σ|). Strings are members of 2^(<N), the finite sequences over {0, 1}. We want to try to see which infinite binary strings are random.

39 What makes a string random? Which of the following sequences seem random? [Six sample sequences, labelled A to F, were shown on the slides; the answers were revealed one at a time.]
Not random: a sequence of zeroes.
Not random: 001 and 101 repeated.
Random: a sequence from random coin tosses.
Not random: 0, 1, 2, 3, 4, 5, ... written in binary.
Semi-random: odd digits 0, even digits coin tosses.
Not random: the binary expansion of every other digit of π.

50 What makes a string random? In many of these cases, the string is non-random because we can easily describe it / predict the next digit. In terms of probability and measure theory, these are all equally likely; no single element of the sample space can be random... but how do we separate them? The first attempt was made by the statistician von Mises in 1919: to have an acceptable selection rule that generalizes the weak law of large numbers. If α = 0.a_0 a_1 a_2 ..., then whenever we select a subsequence via a selection rule f, the proportion of n where a_f(n) = 1 should be asymptotically 1/2.

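Von Mises's idea can be illustrated concretely. The repeating sequence 010101... has overall bit-frequency exactly 1/2, yet the simple selection rule "select the position right after each 0" picks out a subsequence of all 1s, so the sequence fails the criterion. (A Python sketch; the sequence and the rule are just illustrative choices.)

```python
seq = "01" * 1000  # a patently non-random sequence with bit-frequency 1/2

# Overall frequency of 1s is exactly 1/2 ...
overall = seq.count("1") / len(seq)

# ... but the rule "select position n if bit n-1 was 0" picks out
# a subsequence consisting entirely of 1s.
selected = [seq[n] for n in range(1, len(seq)) if seq[n - 1] == "0"]
freq = selected.count("1") / len(selected)
# overall == 0.5, but freq == 1.0: the selected subsequence is biased
```

Note that the selection rule only looks at bits already seen, never at the bit it is about to select; this is essential to the definition.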
52 What makes a string random? What would these acceptable selection rules be? With the development of computability, Church linked these rules to computable functions: take all computable stochastic properties. As pointed out by Ville, this was not good enough: he showed there were reals which passed all such selection rules, yet look intuitively non-random. Eventually Martin-Löf hit upon the idea of using effectively presented sets of Lebesgue measure 0, called Martin-Löf tests. A real is ML-random if it does not belong to any of these effectively presented statistical tests.

54 What makes a string random? There are three main approaches to defining algorithmic randomness. We just gave the first: (I) Statistician's approach: a ML-random real possesses no algorithmically distinguishable trait. Deals with rare patterns using measure theory. (II) Coder's approach: since algorithmically distinguishable traits can be used to compress information, a random string should be incompressible. E.g. a text file can be zipped to 50% of its size, but a JPEG file can hardly be compressed.

57 (II): The coder's approach Another example: to describe the 20-bit string 0101...01 we need to know only (i) the pattern 01 and (ii) the length 20 (log 20 bits), altogether 2 + log 20 bits ≪ 20. To output a 20-bit sequence of random coin tosses, we need to hardwire it into our system.

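The compression intuition is easy to check with an off-the-shelf compressor, used here as a rough stand-in for Kolmogorov complexity (which is itself non-computable). A Python sketch using zlib; the seeded pseudorandom bytes merely simulate coin tosses:

```python
import random
import zlib

patterned = b"01" * 5000              # 10,000 bytes with an obvious pattern
random.seed(0)                        # fixed seed so the run is reproducible
noisy = bytes(random.getrandbits(8) for _ in range(10000))

# The patterned data compresses to a tiny fraction of its size;
# the (pseudo)random data is essentially incompressible.
small = len(zlib.compress(patterned))   # a few dozen bytes
large = len(zlib.compress(noisy))       # close to 10,000 bytes
```

A general-purpose compressor only detects some regularities; Kolmogorov complexity is the idealized limit where the "compressor" is a universal machine.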
59 (II): The coder's approach The Mandelbrot set fractal: storing the colour of each pixel requires 1.62 million bits; storing its generating program requires far fewer resources.

60 (II): The coder's approach To formalize this, take a fixed machine M, and define: the plain complexity C(σ) of a string σ ∈ 2^(<N) is the length of the shortest τ where M(τ) converges and outputs σ (due to Kolmogorov). Kolmogorov showed that universal machines exist, i.e. a machine U such that for every other machine M, C_U(σ) ≤ C_M(σ) + O(1). You might be tempted to say that a real X is random if for every n, C(X ↾ n) ≥ n − O(1), where X ↾ n denotes the first n bits of X.

62 (II): The coder's approach Unfortunately, Theorem (Martin-Löf): there is no real X such that C(X ↾ n) ≥ n − O(1) for every n. The trick is to observe that any finite binary string is associated with a number: number off 2^(<N) from left to right, e.g. 0 ↔ 1, 1 ↔ 2, 00 ↔ 3, 01 ↔ 4, 10 ↔ 5, etc.

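This left-to-right numbering of 2^(<N) is the standard length-lexicographic bijection: string number k is the binary expansion of k + 1 with its leading 1 removed. A quick sketch (Python, for illustration):

```python
def string_of(k):
    """The k-th binary string in length-lexicographic order;
    k = 0 gives the empty string."""
    return bin(k + 1)[3:]  # drop the '0b' prefix and the leading 1

def index_of(s):
    """Inverse: the position of binary string s in the enumeration."""
    return int("1" + s, 2) - 1

# string_of(1) == '0', string_of(2) == '1', string_of(3) == '00',
# string_of(4) == '01', string_of(5) == '10', matching the slide.
```

The point exploited in the proof is that a string thereby encodes a number for free, so its length can smuggle in extra information.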
64 (II): The coder's approach Consider the machine M that does the following. On input τ, M computes the string σ which is the |τ|-th string to be named, and outputs στ. For any n, we can look at the number k where X ↾ n is the k-th string to be named. Let τ be the next k bits of X, i.e. τ = X(n)X(n+1)...X(n+k−1). Then M(τ) = (X ↾ n)τ = X ↾ (n+k). Hence C(X ↾ (n+k)) ≤ k + O(1): infinitely many initial segments of X can be compressed (here by about n bits).

67 (II): The coder's approach Under this system, the length of a string is used to give extra non-trivial information. To avoid this, Chaitin, Levin and Schnorr looked at machines whose domain is an antichain under string extension. The prefix-free complexity K(σ) is the length of the shortest string τ where M(τ) = σ and M is a universal prefix-free machine. Then there are reals X such that K(X ↾ n) ≥ n − O(1), formalizing the notion of "random" in terms of "incompressible". Schnorr showed that approaches I and II are the same!

69 (III): The gambler's approach The third approach to calibrating randomness is through the intuition that you should not be able to make arbitrarily much money when trying to predict the digits of a random string. Suppose you walk into a casino with a certain amount of money (say $10). The manager has in his pocket the digits of a real X (which you don't know, naturally). At the n-th round, you are given X ↾ n and have to try to guess X(n), the next digit. You decide a weight p ≤ 2, assigning p to X(n) = 0 and 2 − p to X(n) = 1.

72 (III): The gambler's approach The manager then reveals the next digit X(n) to you. Your capital C_n is p·C_(n−1) if X(n) = 0 and (2 − p)·C_(n−1) if X(n) = 1, where C_(n−1) is your stage n − 1 capital. You win if, in the limit, your capital tends to infinity. A real X is random if you cannot win against X using only certain effective betting strategies. For example, it is easy to win against a periodic sequence. (Schnorr) The gambler's approach III gives exactly the same class as I and II.

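A betting strategy of this kind is a martingale: the casino's payoff rule preserves expected capital, so you only grow rich by genuinely predicting the digits. Against the periodic sequence 0101..., the obvious strategy stakes everything on the predicted bit and doubles its capital every round. (Python sketch; the sequence and strategy are illustrative choices.)

```python
def play(sequence, predict, capital=10.0):
    """Bet against `sequence`: at round n the strategy sees the first
    n digits and stakes weight p on digit 0. Capital becomes p*C if
    the digit is 0, and (2 - p)*C if it is 1."""
    for n, bit in enumerate(sequence):
        guess = predict(sequence[:n])     # strategy sees only X restricted to n
        p = 2.0 if guess == "0" else 0.0  # stake everything on the guess
        capital *= p if bit == "0" else (2.0 - p)
    return capital

# Predict the alternating pattern: digit n is '0' iff n is even.
alternating = lambda prefix: "0" if len(prefix) % 2 == 0 else "1"
final = play("01" * 20, alternating)
# capital doubles on each of the 40 rounds: 10 * 2**40
```

Staking everything is reckless against a truly random X (one wrong guess bankrupts you); subtler strategies hedge with intermediate p, but no effective strategy wins against a random real.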
75 An example of a ML-random real The most famous example is Ω = µ(dom U) = Σ_(U(σ)↓) 2^(−|σ|), where U is the universal prefix-free machine. Ω is a left-c.e. real, i.e. there is a computable increasing sequence of rationals q_0 < q_1 < q_2 < ... converging to Ω. In fact any left-c.e. random real is µ(dom M) for some prefix-free machine M: these are analogues of the c.e. sets.

76 An example of a ML-random real Theorem (Chaitin): Ω is ML-random. Sketch of proof: we can look at the stage-s approximation Ω_s. We build a prefix-free machine M. Whenever we see K_s(Ω_s ↾ n) drop below n − O(1), we make M(τ) converge on some τ of length roughly n. This has to be reflected by Ω = µ(dom U), since U is universal; so the approximation to Ω beyond Ω_s ↾ n has to increase.

77 Class of random reals There are lots of ML-random reals: the class of them has measure 1. They are all contained in a Σ^0_2 class {X : ∃c ∀n K(X ↾ n) > n − c}. Hence there are randoms with low Turing degree, and with hyperimmune-free degree. On the other hand they combine nicely with the jump operator: (Kučera) the class of ML-randoms has all possible jumps; (Downey, Miller) the class of ML-randoms computable from the Halting problem ∅′ also has all possible jumps.

79 My work What other ways are there of measuring randomness? What level of randomness is needed for different applications? How does one measure relative randomness? What are the ways to extend the concept of feebleness to randomness notions? What do they have to do with feebleness in computability theory? How do randomness and computability interact? Must random reals be computationally powerful?

80 The computational strength of random reals We expect not to be able to effectively extract a lot of coherent data from a random real. E.g. how do we effectively (mechanically) extract useful data out of random coin tosses? So random reals should not be computationally powerful, in terms of classical computability-theoretic notions. Unfortunately, Kučera and Gács proved that any real X ∈ 2^N can be computed from a ML-random real R (i.e. X ≤_T R). ML-randoms can contain as much non-trivial information as we want!

83 The computational strength of random reals A very closely related notion... what's the easiest way of constructing a function f which is computationally non-trivial? List out all the Turing machines M_0, M_1, M_2, ... and let ϕ_0, ϕ_1, ϕ_2, ... be the functions simulated by the TMs. Define f to be different from each ϕ_e, i.e. f(x_e) ≠ ϕ_e(x_e) for some x_e. A function f : N → N is diagonally non-computable (d.n.c.) if for every e, f(e) ≠ ϕ_e(e).

85 The computational strength of random reals A d.n.c. function is non-computable, and is frequently much more than that. E.g. if a d.n.c. function f is of c.e. degree, then it computes the Halting problem: f ≥_T ∅′. On the other hand, there are d.n.c. functions of low Turing degree. Longstanding question: must a d.n.c. function always be strictly more than non-computable? (Kumabe) There is a d.n.c. function f such that f computes nothing other than itself and the computable sets.

88 The computational strength of random reals Another example that d.n.c. = computationally strong: take the class of d.n.c. functions whose range is {0, 1}. (Jockusch, Soare) A binary-valued function f is d.n.c. iff f computes a total extension of PA. Every f of PA degree (i.e. binary-valued d.n.c.) computes a ML-random. This is not true of every d.n.c. function (e.g. Kumabe's). There has been increasing evidence that the bound on the range of a d.n.c. function f is strongly related to prefix-free complexity.

91 The computational strength of random reals Theorem (Stephan): a ML-random real A computes the Halting problem iff it computes a binary-valued d.n.c. function. This result says there are only two kinds of ML-random reals: 1 The first kind resemble Ω, and are so smart that they know how to be stupid. 2 The second really are stupid (they fail to compute a binary-valued d.n.c. function). Recent work (Franklin, Ng) has made the second class (the "true" ML-randoms) better understood.

94 Variations on ML-randomness Two stronger forms of ML-randomness have been studied. The first, weak 2-randomness, requires avoiding every Π⁰₂ null class. Theorem (Downey, Nies, Weber, Yu) A is weakly 2-random iff A is random and contains no common information with the Halting problem. The second, 2-randomness, also exhibits properties demonstrating weakness. Theorem Every 2-random real A is generalized low, i.e. A′ ≡_T A ⊕ ∅′.

96 Variations on ML-randomness The class of weakly 2-randoms remains poorly understood. For example, is there a definition in terms of initial segment complexity? (Barmpalias, Downey, Ng) Unlike the ML-randoms, the weakly 2-randoms do not have all possible jumps. The jumps of weakly 2-randoms are very closely related to ∅′-domination and the functions d.n.c. relative to ∅′. Suppose you take two ML-random reals A and B. Then A ⊕ B = {2n : n ∈ A} ∪ {2n + 1 : n ∈ B} is in general not ML-random.
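The join in the last sentence is a simple computable operation; a minimal sketch (my own, on finite sets and finite bit strings rather than reals):

```python
def join(A, B):
    """The join A ⊕ B = {2n : n in A} ∪ {2n+1 : n in B}, for finite sets."""
    return {2 * n for n in A} | {2 * n + 1 for n in B}

def interleave(a_bits: str, b_bits: str) -> str:
    """On characteristic sequences, the join is just bit interleaving:
    A's bits occupy the even positions, B's bits the odd ones."""
    return "".join(x + y for x, y in zip(a_bits, b_bits))

assert join({0, 2}, {1}) == {0, 3, 4}
assert interleave("101", "110") == "110110"
```

Taking A = B (or B computed from A) shows why A ⊕ B can fail to be ML-random even when each half is: the interleaving makes the shared information visible.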

99 Variations on ML-randomness (van Lambalgen) A ⊕ B is ML-random iff A is random and B is random relative to A. (Barmpalias, Downey, Ng) For weakly 2-randoms, one direction of the corresponding fact fails while the other holds. Theorem (Barmpalias, Downey, Ng) Given any function f : N → N there is a weakly 2-random A and some g ≤_T A where g is not dominated by f. Weakly 2-random reals are computationally weak (having no common information with ∅′) but still strong enough to escape domination.

101 Variations on ML-randomness A special case of this theorem gives that weakly 2-random reals can be array non-computable. This notion arises in the study of c.e. degrees. It has found use in an impressive array of areas within computability theory: lattice embeddings, computable analysis, c.e.a. operators, genericity. (Brodhead, Downey, Ng) Recently it was linked to randomness: a certain weakening of ML-randomness, called computably bounded randomness, was characterized in terms of array computability amongst the c.e. degrees.

102 Hausdorff dimension Another way of measuring semi-randomness is to look at effective Hausdorff and packing dimensions. Classically, the Hausdorff dimension gives a way of refining sets of measure 0. The effective version of Hausdorff dimension has been studied in the work of Lutz and Mayordomo. For 0 < s ≤ 1, an s-gale is a function F : 2^{<N} → R≥0 such that F(σ0) + F(σ1) = 2^s F(σ). These are betting strategies. When s = 1 this is exactly the martingale condition we discussed for ML-randomness.

104 Hausdorff dimension in terms of s-gales Suppose now you return to the casino with $1 and play against the real X in the manager's pocket. The casino has a new rule: for every round you stay in, it takes a fraction (depending on s) of your current capital. Previously (if s = 1) we could refrain from favouring one side simply by betting an equal amount on 0 and 1. Now if we do this we only get back 2^{s−1} F(σ) < F(σ) when s < 1. Now it is much harder for you to win, because you need a lot more knowledge about X in order to win.
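The casino's cut can be simulated in a few lines (my own sketch, not from the talk): under the s-gale rule, betting evenly multiplies the capital by 2^{s−1} each round, so for s < 1 it decays geometrically no matter what X is.

```python
def even_bet_capital(s: float, rounds: int, start: float = 1.0) -> float:
    """Capital after betting equal amounts on 0 and 1 for `rounds` rounds
    under the s-gale rule F(sigma 0) + F(sigma 1) = 2**s * F(sigma):
    splitting evenly gives each child 2**(s - 1) times the parent capital."""
    cap = start
    for _ in range(rounds):
        cap *= 2 ** (s - 1)
    return cap

# s = 1: fair game, even betting preserves capital exactly.
# s = 1/2: after 10 rounds only 2**-5 of the capital remains.
```

So to succeed against X with s < 1 a bettor must skew its bets, i.e. predict X well enough to outrun the per-round tax.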

107 Hausdorff dimension in terms of s-gales Lutz showed that effective Hausdorff dimension can be characterized in terms of s-gales: Theorem (Lutz, Mayordomo) For a class X of reals, dim(X) = inf{s ∈ Q : for some s-gale F, X ⊆ S^∞[F]}, where S^∞[F] is the class of reals on which F succeeds.

108 Effective Hausdorff dimension C.e. s-gales give an effective version of Hausdorff dimension. Remarkably, Theorem (Mayordomo) The effective Hausdorff dimension of a real A is lim inf_n K(A ↾ n)/n (= lim inf_n C(A ↾ n)/n). For instance if Ω = 0.a₁a₂a₃... then 0.a₁0a₂0a₃0... has effective Hausdorff dimension 1/2.
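The dilution construction can be played with concretely. A heavily hedged illustration (my own, not from the talk): zlib's compressed size is only a crude computable stand-in for Kolmogorov complexity, Ω itself is uncomputable so pseudorandom bytes take its place, and dilution happens at byte rather than bit level, so the density only drops noticeably rather than to exactly 1/2.

```python
import os
import zlib

def ratio(data: bytes) -> float:
    """zlib compressed size over original size: a crude, computable
    stand-in for C(x)/|x| (true Kolmogorov complexity is uncomputable)."""
    return len(zlib.compress(data, 9)) / len(data)

def dilute(data: bytes) -> bytes:
    """Insert a zero byte after every byte, mimicking a1 0 a2 0 a3 0 ..."""
    return b"".join(bytes([b, 0]) for b in data)

random_like = os.urandom(1 << 15)   # pseudorandom stand-in for bits of Omega
assert ratio(random_like) > 0.9           # near-incompressible
assert ratio(dilute(random_like)) < 0.8   # dilution lowers the density
```

The real theorem is the bit-level limit statement; the sketch only shows the qualitative effect the slide describes.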

110 Hausdorff dimension extraction Is this the only way to construct a real of effective Hausdorff dimension 1/2? A lot of work has been done on Hausdorff dimension extraction, with many lovely results (Greenberg, Miller, Reimann). Having effective Hausdorff dimension 1 is closely related to the d.n.c. functions. Some questions still remain, e.g. when can a degree compute a real of effective Hausdorff dimension 1?

111 Extracting packing dimension The idea is to replace outer measure by inner measure: look for a dense packing. Classically this is known as packing dimension. Athreya, Hitchcock, Lutz and Mayordomo also characterized packing dimension in terms of martingales. One can define the effective packing dimension, which is characterized as lim sup_n K(α ↾ n)/n (= lim sup_n C(α ↾ n)/n).
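The liminf/limsup contrast behind the two dimensions can be seen in a toy sequence that alternates incompressible and all-zero blocks (again a sketch of my own, with zlib ratio as a crude proxy for C(α ↾ n)/n): its complexity density keeps oscillating, so the liminf-style (Hausdorff) value is small while the limsup-style (packing) value is large.

```python
import os
import zlib

def ratio(data: bytes) -> float:
    """zlib compressed size over original size (crude stand-in for C/n)."""
    return len(zlib.compress(data, 9)) / len(data)

# Alternate incompressible and all-zero blocks of doubling length.
seq, blocks = b"", []
for k in range(8):
    n = 2 ** (k + 10)
    seq += os.urandom(n) if k % 2 == 0 else bytes(n)
    blocks.append(ratio(seq))   # complexity density of each prefix

# The density oscillates: after a random block it is high (each such
# block is half the prefix so far), after a zero block it is low.
assert max(blocks[4:]) > 0.55 and min(blocks[4:]) < 0.45
```

A real built this way has effective Hausdorff dimension strictly below its effective packing dimension, which is why the two extraction questions behave so differently.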

112 Extracting packing dimension Fundamental question: which Turing degrees contain reals of high packing dimension? Work of Greenberg, Downey, Ng: Amongst the c.e. degrees, packing dimension 1 = array noncomputability, but this does not extend beyond the c.e. degrees. Theorem (Fortnow, Hitchcock, Aduri, Vinodchandran, Wang) If α has packing dimension > 0, then for any ε > 0 there is β ≤_wtt α of packing dimension ≥ 1 − ε. Hence for degrees there is a 0-1 law for effective packing dimension (no broken dimension). Open question: is there a real of effective packing dimension 1 inside each degree of packing dimension 1?

114 The proof This proof is due to Bienvenu. We have C(α ↾ n) ≥ tn for infinitely many n, for some t > 0. Break α into pieces of exponentially growing length α ↾ m₀, α ↾ m₁, α ↾ m₂, ..., with each interval [m_k, m_{k+1}) large. If m_{k−1} ≤ n < m_k, then C(α ↾ n) ≤ C(α ↾ m_k) + O(log n). Choosing t small, C(α ↾ m_k) ≥ t m_k for infinitely many k. We are going to swap these pieces α ↾ m_k for more complex pieces τ₀, τ₁, ....

117 The proof Now let s = lim sup_k C(α ↾ m_k)/m_k ≥ t > 0. Swap α ↾ m_k for τ_k, where U(τ_k) = α ↾ m_k, and demand that |τ_k| ≤ s m_k. What is C(τ_k)? Since α ↾ m_k is computable from τ_k, we get C(τ_k) ≥ C(α ↾ m_k) − O(1) ≈ s m_k infinitely often. Hence C(τ_k)/|τ_k| is infinitely often close to 1, by adjusting the m_k and the tolerance in ≈. The original proof was a bit different: it gave polynomial time reductions using the complex multisource extractors of Impagliazzo and Wigderson.
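A very loose sketch of the swapping idea (assumptions of mine throughout: zlib stands in both for the universal machine U and for shortest descriptions, bytes replace bits, and the pieces are disjoint chunks; this shows the shape of the argument, not the actual construction):

```python
import os
import zlib

def ratio(data: bytes) -> float:
    """zlib compressed size over original size (crude stand-in for C(x)/|x|)."""
    return len(zlib.compress(data, 9)) / len(data)

# alpha: compressible but not trivial -- random bytes diluted with zeros.
rand = os.urandom(1 << 14)
alpha = b"".join(bytes([b, 0, 0, 0]) for b in rand)

# Pieces of exponentially growing length, as in the proof sketch.
m = [2 ** k for k in range(10, 17)]
chunks = [alpha[m[k]:m[k + 1]] for k in range(len(m) - 1)]

# tau_k: a short description of the k-th piece; zlib.decompress plays
# the role of the universal machine U, so "U(tau_k)" recovers the piece.
taus = [zlib.compress(c, 9) for c in chunks]
beta = b"".join(taus)

assert zlib.decompress(taus[0]) == chunks[0]  # pieces are recoverable
assert ratio(beta) > ratio(alpha)             # beta is much "denser"
```

Each τ_k is about as complex as the piece it describes but much shorter, so the swapped sequence has complexity density near 1 infinitely often, which is the limsup (packing) statement.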

121 Lots of other work Lowness in terms of randomness notions. Different combinatorial notions relating to these lowness notions, such as traceability and the diamond classes. Work on other randomness notions such as Schnorr and computable randomness. Many intermediate randomness notions defined using different kinds of betting strategies.


More information

INCOMPLETENESS I by Harvey M. Friedman Distinguished University Professor Mathematics, Philosophy, Computer Science Ohio State University Invitation

INCOMPLETENESS I by Harvey M. Friedman Distinguished University Professor Mathematics, Philosophy, Computer Science Ohio State University Invitation INCOMPLETENESS I by Harvey M. Friedman Distinguished University Professor Mathematics, Philosophy, Computer Science Ohio State University Invitation to Mathematics Series Department of Mathematics Ohio

More information

Mass Problems. Stephen G. Simpson. Pennsylvania State University NSF DMS , DMS

Mass Problems. Stephen G. Simpson. Pennsylvania State University NSF DMS , DMS Mass Problems Stephen G. Simpson Pennsylvania State University NSF DMS-0600823, DMS-0652637 http://www.math.psu.edu/simpson/ simpson@math.psu.edu Logic Seminar Department of Mathematics University of Chicago

More information

Introduction to Turing Machines

Introduction to Turing Machines Introduction to Turing Machines Deepak D Souza Department of Computer Science and Automation Indian Institute of Science, Bangalore. 12 November 2015 Outline 1 Turing Machines 2 Formal definitions 3 Computability

More information

ENEE 459E/CMSC 498R In-class exercise February 10, 2015

ENEE 459E/CMSC 498R In-class exercise February 10, 2015 ENEE 459E/CMSC 498R In-class exercise February 10, 2015 In this in-class exercise, we will explore what it means for a problem to be intractable (i.e. it cannot be solved by an efficient algorithm). There

More information

Some results on effective randomness

Some results on effective randomness Some results on effective randomness (Preliminary version September 2003) Wolfgang Merkle 1, Nenad Mihailović 1, and Theodore A. Slaman 2 1 Ruprecht-Karls-Universität Heidelberg, Institut für Informatik,

More information

Well-foundedness of Countable Ordinals and the Hydra Game

Well-foundedness of Countable Ordinals and the Hydra Game Well-foundedness of Countable Ordinals and the Hydra Game Noah Schoem September 11, 2014 1 Abstract An argument involving the Hydra game shows why ACA 0 is insufficient for a theory of ordinals in which

More information

Lecture 14 Rosser s Theorem, the length of proofs, Robinson s Arithmetic, and Church s theorem. Michael Beeson

Lecture 14 Rosser s Theorem, the length of proofs, Robinson s Arithmetic, and Church s theorem. Michael Beeson Lecture 14 Rosser s Theorem, the length of proofs, Robinson s Arithmetic, and Church s theorem Michael Beeson The hypotheses needed to prove incompleteness The question immediate arises whether the incompleteness

More information

Computability Theory. CS215, Lecture 6,

Computability Theory. CS215, Lecture 6, Computability Theory CS215, Lecture 6, 2000 1 The Birth of Turing Machines At the end of the 19th century, Gottlob Frege conjectured that mathematics could be built from fundamental logic In 1900 David

More information

Computation. Some history...

Computation. Some history... Computation Motivating questions: What does computation mean? What are the similarities and differences between computation in computers and in natural systems? What are the limits of computation? Are

More information

The Legacy of Hilbert, Gödel, Gentzen and Turing

The Legacy of Hilbert, Gödel, Gentzen and Turing The Legacy of Hilbert, Gödel, Gentzen and Turing Amílcar Sernadas Departamento de Matemática - Instituto Superior Técnico Security and Quantum Information Group - Instituto de Telecomunicações TULisbon

More information

6-1 Computational Complexity

6-1 Computational Complexity 6-1 Computational Complexity 6. Computational Complexity Computational models Turing Machines Time complexity Non-determinism, witnesses, and short proofs. Complexity classes: P, NP, conp Polynomial-time

More information

Random Reals à la Chaitin with or without prefix-freeness

Random Reals à la Chaitin with or without prefix-freeness Random Reals à la Chaitin with or without prefix-freeness Verónica Becher Departamento de Computación, FCEyN Universidad de Buenos Aires - CONICET Argentina vbecher@dc.uba.ar Serge Grigorieff LIAFA, Université

More information

Time-Bounded Kolmogorov Complexity and Solovay Functions

Time-Bounded Kolmogorov Complexity and Solovay Functions Time-Bounded Kolmogorov Complexity and Solovay Functions Rupert Hölzl, Thorsten Kräling, and Wolfgang Merkle Institut für Informatik, Ruprecht-Karls-Universität, Heidelberg, Germany Abstract. A Solovay

More information

On approximate decidability of minimal programs 1

On approximate decidability of minimal programs 1 On approximate decidability of minimal programs 1 Jason Teutsch June 12, 2014 1 Joint work with Marius Zimand How not to estimate complexity What s hard about Kolmogorov complexity? the case of functions

More information

March 12, 2011 DIAGONALLY NON-RECURSIVE FUNCTIONS AND EFFECTIVE HAUSDORFF DIMENSION

March 12, 2011 DIAGONALLY NON-RECURSIVE FUNCTIONS AND EFFECTIVE HAUSDORFF DIMENSION March 12, 2011 DIAGONALLY NON-RECURSIVE FUNCTIONS AND EFFECTIVE HAUSDORFF DIMENSION NOAM GREENBERG AND JOSEPH S. MILLER Abstract. We prove that every sufficiently slow growing DNR function computes a real

More information

Chaitin Ω Numbers and Halting Problems

Chaitin Ω Numbers and Halting Problems Chaitin Ω Numbers and Halting Problems Kohtaro Tadaki Research and Development Initiative, Chuo University CREST, JST 1 13 27 Kasuga, Bunkyo-ku, Tokyo 112-8551, Japan E-mail: tadaki@kc.chuo-u.ac.jp Abstract.

More information

Alan Turing in the Twenty-first Century: Normal Numbers, Randomness, and Finite Automata. Jack Lutz Iowa State University

Alan Turing in the Twenty-first Century: Normal Numbers, Randomness, and Finite Automata. Jack Lutz Iowa State University Alan Turing in the Twenty-first Century: Normal Numbers, Randomness, and Finite Automata Jack Lutz Iowa State University Main reference A note on normal numbers in and Main references on main reference

More information

Minimal weak truth table degrees and computably enumerable Turing degrees. Rod Downey Keng Meng Ng Reed Solomon

Minimal weak truth table degrees and computably enumerable Turing degrees. Rod Downey Keng Meng Ng Reed Solomon Minimal weak truth table degrees and computably enumerable Turing degrees Rod Downey Keng Meng Ng Reed Solomon May 9, 2017 2 Chapter 1 Introduction Computability theorists have studied many different reducibilities

More information

Monotonically Computable Real Numbers

Monotonically Computable Real Numbers Monotonically Computable Real Numbers Robert Rettinger a, Xizhong Zheng b,, Romain Gengler b, Burchard von Braunmühl b a Theoretische Informatik II, FernUniversität Hagen, 58084 Hagen, Germany b Theoretische

More information

We are going to discuss what it means for a sequence to converge in three stages: First, we define what it means for a sequence to converge to zero

We are going to discuss what it means for a sequence to converge in three stages: First, we define what it means for a sequence to converge to zero Chapter Limits of Sequences Calculus Student: lim s n = 0 means the s n are getting closer and closer to zero but never gets there. Instructor: ARGHHHHH! Exercise. Think of a better response for the instructor.

More information

The Logical Approach to Randomness

The Logical Approach to Randomness The Logical Approach to Randomness Christopher P. Porter University of Florida UC Irvine C-ALPHA Seminar May 12, 2015 Introduction The concept of randomness plays an important role in mathematical practice,

More information

Symbolic Dynamics: Entropy = Dimension = Complexity

Symbolic Dynamics: Entropy = Dimension = Complexity Symbolic Dynamics: Entropy = Dimension = omplexity Stephen G. Simpson Pennsylvania State University http://www.math.psu.edu/simpson/ simpson@math.psu.edu Worshop on Infinity and Truth Institute for Mathematical

More information

Church s undecidability result

Church s undecidability result Church s undecidability result Alan Turing Birth Centennial Talk at IIT Bombay, Mumbai Joachim Breitner April 21, 2011 Welcome, and thank you for the invitation to speak about Church s lambda calculus

More information

Polynomial-Time Random Oracles and Separating Complexity Classes

Polynomial-Time Random Oracles and Separating Complexity Classes Polynomial-Time Random Oracles and Separating Complexity Classes John M. Hitchcock Department of Computer Science University of Wyoming jhitchco@cs.uwyo.edu Adewale Sekoni Department of Computer Science

More information

Uncountable computable model theory

Uncountable computable model theory Uncountable computable model theory Noam Greenberg Victoria University of Wellington 30 th June 2013 Mathematical structures Model theory provides an abstract formalisation of the notion of a mathematical

More information

Alan Turing s Contributions to the Sciences

Alan Turing s Contributions to the Sciences Alan Turing s Contributions to the Sciences Prakash Panangaden School of Computer Science McGill University User-Defined Placeholder Text 1 2 3 Who is Alan Turing? Logician Mathematician Cryptanalyst Computer

More information

Most General computer?

Most General computer? Turing Machines Most General computer? DFAs are simple model of computation. Accept only the regular languages. Is there a kind of computer that can accept any language, or compute any function? Recall

More information

EFFECTIVE SYMBOLIC DYNAMICS AND COMPLEXITY

EFFECTIVE SYMBOLIC DYNAMICS AND COMPLEXITY EFFECTIVE SYMBOLIC DYNAMICS AND COMPLEXITY By FERIT TOSKA A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR

More information

Limitations of Efficient Reducibility to the Kolmogorov Random Strings

Limitations of Efficient Reducibility to the Kolmogorov Random Strings Limitations of Efficient Reducibility to the Kolmogorov Random Strings John M. HITCHCOCK 1 Department of Computer Science, University of Wyoming Abstract. We show the following results for polynomial-time

More information

Weak Lowness Notions for Kolmogorov Complexity. Ian-Cadoc Robertson Herbert

Weak Lowness Notions for Kolmogorov Complexity. Ian-Cadoc Robertson Herbert Weak Lowness Notions for Kolmogorov Complexity by Ian-Cadoc Robertson Herbert A dissertation submitted in partial satisfaction of the requirements for the degree of Doctor of Philosophy in Logic and the

More information

DR.RUPNATHJI( DR.RUPAK NATH )

DR.RUPNATHJI( DR.RUPAK NATH ) Contents 1 Sets 1 2 The Real Numbers 9 3 Sequences 29 4 Series 59 5 Functions 81 6 Power Series 105 7 The elementary functions 111 Chapter 1 Sets It is very convenient to introduce some notation and terminology

More information

CS154, Lecture 12: Kolmogorov Complexity: A Universal Theory of Data Compression

CS154, Lecture 12: Kolmogorov Complexity: A Universal Theory of Data Compression CS154, Lecture 12: Kolmogorov Complexity: A Universal Theory of Data Compression Rosencrantz & Guildenstern Are Dead (Tom Stoppard) Rigged Lottery? And the winning numbers are: 1, 2, 3, 4, 5, 6 But is

More information

18.175: Lecture 2 Extension theorems, random variables, distributions

18.175: Lecture 2 Extension theorems, random variables, distributions 18.175: Lecture 2 Extension theorems, random variables, distributions Scott Sheffield MIT Outline Extension theorems Characterizing measures on R d Random variables Outline Extension theorems Characterizing

More information

MEASURES AND THEIR RANDOM REALS

MEASURES AND THEIR RANDOM REALS MEASURES AND THEIR RANDOM REALS JAN REIMANN AND THEODORE A. SLAMAN Abstract. We study the randomness properties of reals with respect to arbitrary probability measures on Cantor space. We show that every

More information

Selected Topics in the Theory of the Computably Enumerable Sets and Degrees

Selected Topics in the Theory of the Computably Enumerable Sets and Degrees Selected Topics in the Theory of the Computably Enumerable Sets and Degrees Klaus Ambos-Spies Ruprecht-Karls-Universität Heidelberg Institut für Informatik Handout 3-8 November 2017 4 Effective reducibilities

More information