Recovering randomness from an asymptotic Hamming distance

1-2 Recovering randomness from an asymptotic Hamming distance. Bjørn Kjos-Hanssen. Consultants: Cameron E. Freer and Joseph S. Miller. March 23, 2011, Workshop in Computability, U. San Francisco.

3-10 Understanding randomness. Kolmogorov's probability experiment definition: Ω = sample space; P(A) ∈ [0, 1] for A ⊆ Ω (A in a σ-algebra); P is countably additive; and P(∅) = 0, P(Ω) = 1. Randomness is introduced by a phrase like: if x ∈ Ω is chosen at random, then P(x ∈ A) = P(A) = ... What aspects of randomness are useful?

11-17 A webmaster's use of randomness. 7 news items to display on a web page, each with probability 1/7. Do not want to store a record of what has been displayed, so cycling through the items is not an option. Do not want to display the same item too often. The sequence of displayed items should satisfy the strong law of large numbers, not just overall but in each user's experience.

18-20 A webmaster's use of randomness.

var currentnewsid = Math.floor(Math.random()*7);
makenews(currentnewsid);

Hopefully Math.random() gives random enough output.
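As an illustration (not part of the slides), a small Node.js sketch that repeats the selection above and checks the law-of-large-numbers requirement empirically; the function name pickNewsId is hypothetical and the display step is omitted.

// Sketch only: simulate many page views and check that each of the 7
// items is displayed with empirical frequency close to 1/7.
function pickNewsId() {
  return Math.floor(Math.random() * 7); // as on the slide
}

const counts = new Array(7).fill(0);
const views = 1000000;
for (let i = 0; i < views; i++) {
  counts[pickNewsId()]++;
}
counts.forEach((c, id) => {
  console.log("item " + id + ": frequency " + (c / views).toFixed(4) + " (target " + (1 / 7).toFixed(4) + ")");
});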

21-22 Random enough. Definition (Martin-Löf, 1966). x ∈ 2^ω is random enough if for each uniformly Σ^0_1 sequence of sets U_n ⊆ 2^ω with P(U_n) ≤ 2^(−n), we have x ∉ ⋂_n U_n. Example: U_n = those outcomes where the AMS meeting news item appears more than 1/7 + ε of the time at some point after time n.

23-24 Theorem (Schnorr/Levin, 1970s). x is Martin-Löf random ⟺ K(x↾n) ≥ n − c for all n, where c is a constant and K is prefix-free Kolmogorov complexity. Definition (K, Merkle, Stephan 2006). A real x is complex if its prefixes have Kolmogorov complexity bounded below by an order function (an unbounded, nondecreasing computable function). For example, x is complex if K(x↾n) ≥ √n + log n for all n.

25-26 Stochastic immunity. Definition. A set X is immune if for each N ∈ C, N ⊈ X (here C denotes the collection of all infinite computable sets). If ω \ X is immune then X is co-immune. If X is both immune and co-immune then X is bi-immune. Definition. A set X is stochastically bi-immune if for each set N ∈ C, X ∩ N satisfies the strong law of large numbers, i.e., lim_{n→∞} |(X ∩ N)↾n| / |N↾n| = 1/2. Could also be called: weakly Mises-Wald-Church stochastic.
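To make the definition concrete, a small sketch (my illustration, not from the talk; function names are hypothetical) that measures the relative frequency of 1s of a 0-1 sequence along an infinite computable set N; the sample sequence is just a pseudo-random stand-in.

// Relative frequency of 1s of the sequence X along the computable set
// N = {0, 3, 6, ...}, up to length n. For a stochastically bi-immune X
// this should tend to 1/2 for every infinite computable N.
function frequencyAlong(X, inN, n) {
  let selected = 0, ones = 0;
  for (let k = 0; k < n; k++) {
    if (inN(k)) {
      selected++;
      if (X(k) === 1) ones++;
    }
  }
  return selected === 0 ? NaN : ones / selected;
}

const coinFlips = Array.from({ length: 100000 }, () => (Math.random() < 0.5 ? 1 : 0));
const X = (k) => coinFlips[k];
const inN = (k) => k % 3 === 0;              // an infinite computable set
console.log(frequencyAlong(X, inN, 100000)); // close to 0.5 for a typical X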

27 Definition. A sequence X ∈ 2^ω is Mises-Wald-Church stochastic if no partial computable monotonic selection rule can select a biased subsequence of X, i.e., a subsequence where the relative frequencies of 0s and 1s do not converge to 1/2.

28-29 Question. Is every stochastically bi-immune set complex? Doubtful, but the answer is almost yes.

30-32 Question. Is every complex set stochastically bi-immune? No: for instance, a complex x padded with zeros on a computable set (such as ∅ ⊕ x) is still complex but fails the strong law of large numbers there. Question. Can we from a complex set compute a stochastically bi-immune set?

33 Medvedev reducibility. For mass problems P and Q, P ≤_s Q means that a single Turing functional maps every member of Q to a member of P; P ≰_s Q means there is no such functional.

34-38 Theorem (Downey, Greenberg, Jockusch, Milans 2009). MLR ≰_s DNR_3. Corollary (inspection of the proof). MWC-stochastic ≰_s DNR_3. Theorem (K, Merkle, Stephan 2006). Complex ≤_s DNR_3. Corollary. MWC-stochastic ≰_s Complex. Theorem (K, in preparation). Weakly MWC-stochastic (stochastically bi-immune) ≰_s Complex.

39 Relationship to Hamming distance. Hamming distance d(σ, τ) is given by d(σ, τ) = |{n : σ(n) ≠ τ(n)}|. Let the collection of all infinite computable sets be denoted by C. Let p : ω → ω. For X, Y ∈ 2^ω and N ∈ C we write X ∼_{p,N} Y ⟺ (∀n ∈ N) d(X↾n, Y↾n) ≤ p(n); X ∼_p Y ⟺ X ∼_{p,ω} Y; X ≈_p Y ⟺ X ∼_{p,L} Y for some L ∈ C. Randomness extraction is easy to understand in terms of ∼_p; it seems hard in terms of ≈_p.
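A short sketch (illustration only; the function names are hypothetical) of the prefix Hamming distance d(X↾n, Y↾n) and a finite check of the relation X ∼_p Y up to a given length.

// d(X|n, Y|n): number of positions below n where the 0-1 sequences differ.
function prefixHammingDistance(X, Y, n) {
  let d = 0;
  for (let k = 0; k < n; k++) {
    if (X(k) !== Y(k)) d++;
  }
  return d;
}

// Finite check of X ~_p Y: d(X|n, Y|n) <= p(n) for all n up to maxN.
function closeUpTo(X, Y, p, maxN) {
  for (let n = 1; n <= maxN; n++) {
    if (prefixHammingDistance(X, Y, n) > p(n)) return false;
  }
  return true;
}

const X = (k) => k % 2;                        // 0,1,0,1,...
const Y = (k) => (k === 10 ? 1 - X(k) : X(k)); // X with one bit flipped
console.log(closeUpTo(X, Y, (n) => Math.sqrt(n), 1000)); // true: the distance stays at most 1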

40-42 Can (almost exactly) characterize those functions p for which MLR ≤_s {Y : Y ∼_p X for some X ∈ MLR}. If p(n) ≪ √n then yes (majority vote). If p(n) ≫ √n then no (more complicated). Moreover, p(n) ≫ √n is compatible with Y being complex; this way we get stochastically bi-immune ≰_s Complex. Can even ensure dim_p(Y) = 1 (where dim_p is effective packing dimension).

43 Definition. Define effective Hausdorff dimension, complex packing dimension, and effective packing dimension of a set A ∈ 2^ω by:

dim_H(A) = lim inf_{n→∞} K(A↾n)/n,
dim_cp(A) = sup_{N∈C} inf_{n∈N} K(A↾n)/n,
dim_p(A) = lim sup_{n→∞} K(A↾n)/n,

so that 0 ≤ dim_H ≤ dim_cp ≤ dim_p ≤ 1.

44 Main Theorem. Let A and B with P(A) = P(B) = 1 be given by A = {X : X is weakly 3-random}, B = {X : X is stochastically bi-immune}. Theorem. Let p : ω → ω be any computable function such that p(n) = ω(√n). Let Φ be a Turing reduction. Then (∃X ∈ A)(∃Y ∼_p X)(Φ^Y ∉ B). Two cases: Φ maps almost every random to a random, or not.

45 Hamming distance. For a set of strings A, d(x, A) = min{d(x, y) : y ∈ A}. The r-fold boundary of A ⊆ {0, 1}^n is {x ∈ {0, 1}^n : 0 < d(x, A) ≤ r}. Balls centered at 0 are given by B(p) = {x : d(x, 0) ≤ p}, where 0 is the string of n many zeroes. A Hamming sphere is a set H with B(p) ⊆ H ⊆ B(p + 1).

46 Hamming spheres. Theorem (Harper, 1966). For each k, n, r, one can find a Hamming sphere that has minimal r-fold boundary among sets of cardinality k in {0, 1}^n. So a ball is a set having minimal boundary, just like in Euclidean space. The cardinality of B(p) is (n choose 0) + ... + (n choose p). The r-fold boundary of B(p) is just the set B(p + r) \ B(p), which then has cardinality (n choose p+1) + ... + (n choose p+r).
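These cardinalities are easy to compute directly; the sketch below (not from the slides, names hypothetical) evaluates |B(p)| and the r-fold boundary size of B(p) from binomial coefficients.

// Binomial coefficient C(n, k).
function binom(n, k) {
  if (k < 0 || k > n) return 0;
  let r = 1;
  for (let i = 1; i <= k; i++) r = (r * (n - i + 1)) / i;
  return Math.round(r);
}

// |B(p)| = C(n,0) + ... + C(n,p): strings of length n with at most p ones.
function ballSize(n, p) {
  let s = 0;
  for (let i = 0; i <= p; i++) s += binom(n, i);
  return s;
}

// The r-fold boundary of B(p) is B(p+r) \ B(p), of size C(n,p+1) + ... + C(n,p+r).
function boundarySize(n, p, r) {
  return ballSize(n, p + r) - ballSize(n, p);
}

console.log(ballSize(10, 3), boundarySize(10, 3, 2)); // 176 and 462 for n = 10, p = 3, r = 2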

47 Proving the Main Theorem. By Harper's theorem, balls in Hamming space have minimum boundary (and, more generally, minimum r-fold boundary) for a given volume. Fortunately, balls still have a pretty large (and known) boundary. So all sets have a large boundary. Thus the adversary producing Y from X can often be on the boundary of the set {X : Φ^X(n) = 1} and thus, by making few changes, set Φ = 0 or Φ = 1 as they desire.

48 Two kinds of Turing reductions. 1 (Easy): λ{X : Φ^X ∉ MLR} = 1. Then for almost every X there is a Y which trivially satisfies Y ∼_p X (namely Y = X), with Φ^Y ∉ MLR. Unfortunately we need X to be weakly 3-random! 2 (Hard): λ{X : Φ^X ∈ MLR} = 1. Then the averages of the values Φ^X(n) approach 1/2 on each sequence of integers almost surely, and one can use a Harper-style argument. Get for each 1-random set X a Y ∼_p X with Φ^Y ∉ MLR.

49 The hard case: Φ maps randoms to randoms. The problem is, Φ may have non-disjoint use. If Φ maps all randoms to randoms, then so does Φ_σ, defined by Φ^X_σ(n) = Φ^{σX}(n), where σX = σ(0)σ(1)···σ(k−1)X(k)X(k+1)··· and k = |σ|. Technical detail: we can computably find infinitely many inputs n where the image measure of Φ_σ is close to Lebesgue measure, in a uniform way over all choices of σ of a fixed length. We then take σ to be the part of Y constructed so far.

50 Question. Why did you obtain results for ∼_p but not for ≈_p? Lack of problem-solving... or a valid reason?

51-53 Justifying p, Part I. Theorem (Law of the iterated logarithm, Khintchine 1924). Let X = (X_0, X_1, ...) be a random variable on 2^ω having the fair-coin distribution. Let S_n = Σ_{k=0}^{n−1} X_k = d(X↾n, ∅↾n). Then with probability one,

lim sup_{n→∞} (S_n − n/2) / (φ(n) · √n / 2) = 1, where φ(n) = √(2 log log n).

The standard deviation of S_n is simply √n / 2, so why the strange φ(n)? Michel Weber (1990): one can replace φ by an arbitrarily slowly growing function if we take the lim sup over n ∈ N for a sparse set N.
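A quick simulation sketch (my illustration, not part of the talk; names hypothetical) of the normalization used in the law of the iterated logarithm; for moderate n the running maximum of (S_n − n/2)/(φ(n)·√n/2) is typically already of order 1.

// Simulate fair coin flips and track the LIL normalization
// (S_n - n/2) / (phi(n) * sqrt(n)/2) with phi(n) = sqrt(2 log log n).
const N = 1000000;
let S = 0, maxRatio = -Infinity;
for (let n = 1; n <= N; n++) {
  S += Math.random() < 0.5 ? 1 : 0;
  if (n >= 10) { // skip tiny n where log log n is not useful
    const phi = Math.sqrt(2 * Math.log(Math.log(n)));
    const ratio = (S - n / 2) / (phi * Math.sqrt(n) / 2);
    if (ratio > maxRatio) maxRatio = ratio;
  }
}
console.log("largest normalized deviation seen:", maxRatio.toFixed(3));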

54 Justifying p, Part II. Theorem. If X is Kurtz random relative to A and S_n := d(X↾n, A↾n), then

lim sup_{n→∞} (S_n − n/2) / (φ(n) · √n / 2) ≥ 1.

55 Justifying p, Part II. Theorem (K, 2007). A has non-DNR Turing degree ⟹ each Martin-Löf random set X is Kurtz random relative to A. (Deeper but less useful here: the converse also holds; Greenberg and J. Miller, 2009.) Corollary. If A ∼_p X, where p is computable, X is Martin-Löf random, and P({Y : Y ∼_p X}) = 0, then A has DNR degree. The amount of closeness ∼_p is sharp, because P(Y ∼_p X) = P(Y ∼_p ∅), and ∅ does not have DNR degree.

56 The end

57 Review of year-old material

58 Can we extract randomness from an almost random sequence? Fix p : ℕ → ℕ. Question: is there a procedure Φ such that if X ∈ {0, 1}^ℕ is random and Y ∼_p X, then Φ(Y) is random? We can call Φ a randomness extractor. Y is almost random, and Φ extracts randomness from Y.

59 Can we extract randomness from an almost random sequence? If our adversary must decide in advance which bits to corrupt, then parity will be a randomness extractor. Otherwise, one approach is to take Φ to be the Majority function on disjoint blocks of the input.

Maj_5(0, 0, 0, 1, 0) = 0,
Maj_3(x, y, z) = (x ∧ y) ∨ (y ∧ z) ∨ (x ∧ z),
Maj_{2n+1}(x_0, ..., x_{2n}) = ⋁_{I ⊆ {0,...,2n}, |I| = n+1} ⋀_{i ∈ I} x_i.

Does this work if p is sufficiently slow-growing? At least if X is random then Maj(X) is random.
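A sketch (illustration only; function names hypothetical) of the majority function, both by counting and via the disjunction-of-conjunctions form of Maj_3 shown above.

// Majority by counting: 1 iff more than half of the 2n+1 input bits are 1.
function maj(bits) {
  const ones = bits.reduce((a, b) => a + b, 0);
  return ones > bits.length / 2 ? 1 : 0;
}

// Maj_3 written as on the slide: (x AND y) OR (y AND z) OR (x AND z).
function maj3(x, y, z) {
  return (x & y) | (y & z) | (x & z);
}

console.log(maj([0, 0, 0, 1, 0])); // 0, matching Maj_5(0, 0, 0, 1, 0) = 0
console.log(maj3(1, 0, 1));        // 1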

60 Majority - a randomness extractor? If X is random and Y ∼_p X, is Φ(Y) still random? Majority is somewhat robust or stable. Hamming distance on {0, 1}^{2n+1}: d(σ, τ) = cardinality of {k : σ(k) ≠ τ(k)}. Example: if d(σ, ⟨0, 0, 0, 1, 0⟩) ≤ 1 then Maj_5(σ) = Maj_5(⟨0, 0, 0, 1, 0⟩). We can change roughly √n of the inputs and not alter the value of Majority. For large n, for most σ of length n and for all τ: d(σ, τ) significantly less than √n ⟹ Maj(σ) = Maj(τ).

61 Let X ∈ {0, 1}^ℕ be a random variable whose distribution is that of infinitely many mutually independent fair coin tosses. The standard deviation of S_n = Σ_{i=1}^n X(i) from its mean n/2 is √n / 2. The Central Limit Theorem:

P(S_n ≥ n/2 + a · √n/2) = P((S_n − n/2) / (√n/2) ≥ a) → N(a) := (1/√(2π)) ∫_a^∞ e^{−x²/2} dx as n → ∞.

Thus if a = a(n) → ∞ then this probability → 0, e.g. if a(n) = log n.
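As a worked bound (a standard Hoeffding-type estimate, not spelled out on the slide), the deviation probability can be bounded explicitly, which shows why a(n) = log n suffices:

\[
P\Bigl(S_n \ge \tfrac{n}{2} + a\,\tfrac{\sqrt{n}}{2}\Bigr) \le e^{-a^2/2},
\qquad\text{so for } a = a(n) = \log n \text{ this is } e^{-(\log n)^2/2} \to 0 \text{ as } n \to \infty.
\]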

62 Can we compute a random sequence Z from an almost-random sequence Y? Y = 0, 1, 1, 0, 1, 1, 0, 0, 0, 1, 1, 0, ... ∈ {0, 1}^ℕ. X is random, Y ∼_p X, Z is computed from Y. We can use Majority as follows: split Y into consecutive blocks and let Z record the majority value of each block, e.g. Y = 0 | 1, 1, 0 | 1, 1, 0, 0, 0 | 1, 1, 0, ... gives Z = 0, 1, 0, ... The block sizes here are 1, 3, 5, ... [Block size must go to ∞. Otherwise our adversary who constructs Y can affect the limiting frequency of 1s in Z on a computably selected sequence of blocks.] Actually each block should be larger than all previous blocks combined.
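A sketch of the construction just described (my own illustration, not the authors' code; names hypothetical): Z is computed from Y by majority vote on disjoint blocks, each block longer than all previous blocks combined.

// Z(i) = majority of Y on the i-th block; block i starts where block i-1
// ended and has odd length 2*start + 1, which exceeds all bits consumed so far.
function extractByBlockMajority(Y, numBlocks) {
  const Z = [];
  let start = 0;
  for (let i = 0; i < numBlocks; i++) {
    const blockLen = 2 * start + 1; // odd, and longer than all previous blocks combined
    let ones = 0;
    for (let k = start; k < start + blockLen; k++) ones += Y(k);
    Z.push(2 * ones > blockLen ? 1 : 0); // strict majority (no ties: blockLen is odd)
    start += blockLen;
  }
  return Z;
}

const Y = (_k) => (Math.random() < 0.5 ? 1 : 0); // pseudo-random stand-in for an almost-random input
console.log(extractByBlockMajority(Y, 10).join(""));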

63 How close to the random set X must Y be for Majority to work? lim_{n→∞} p(n)/√n = 0, i.e., p(n) = o(√n). If p(n) = o(√n), X is random, and Y ∼_p X, then Z := Majority(Y) is random.

64 Democracy is the best form of randomness extraction. If X is random and Y ∼_p X where p(n) = o(√n), then Majority(Y) is random. For any procedure Φ (such as Majority), if X is random and g(n) ∉ O(√n), then there is a Y with Y ∼_g X such that Φ(Y) is not random. This phenomenon for random variables on {0, 1}^n, rather than individual elements of {0, 1}^ℕ, is known in combinatorics / computer science: Harper 1966; Ben-Or and Linial 1989; Lichtenstein, Linial, and Saks 1989.

65 Positive result: p(n) = o(√n); Φ = majority function on disjoint blocks of increasing size; for all ML-random X and all Y ∼_p X, Φ^Y is ML-random. Negative result (almost optimal): p(n) ∉ O(√n); Φ any Turing reduction; for almost all X there is a Y ∼_p X with Φ^Y not ML-random.
