Kolmogorov Complexity


Davide Basilio Bartolini
University of Illinois at Chicago - Politecnico di Milano

Abstract

What follows is a survey of the field of Kolmogorov complexity. Kolmogorov complexity, named after the mathematician Andrej Nikolaevič Kolmogorov, is a measure of the algorithmic complexity of an object (represented as a string of symbols) in terms of how hard it is to describe. This measure dispenses with the need to know the probability distribution governing the object (in general, such a distribution may not be defined at all) and is thus more general than Shannon's entropy, while describing essentially the same fact from a slightly different point of view. This paper gives an overview of this field of study from its origins to some of the latest developments and applications.

I. INTRODUCTION

The idea of measuring the complexity inherent in an object in an algorithmic way was independently developed in the mid 1960s by Ray Solomonoff [12,13], Gregory John Chaitin [4] and Andrej Nikolaevič Kolmogorov [7]. This measure (known today as Kolmogorov complexity, or algorithmic complexity) gives a quantitative definition of information in a novel, algorithmic way, in contrast with the combinatorial and probabilistic approaches already known at the time [7]. Roughly speaking, the new approach defines the complexity of an object by how hard it is to describe it. The universal yardstick for the inherent difficulty of giving a description, needed for such an approach to be applicable, is provided by the theory of computation: the length of the shortest program that, run on a universal Turing machine, outputs the studied object (represented as a string of symbols). The solid basis provided by the theory of computation makes it possible to apply algorithmic complexity to virtually any computable object, without requiring it to be governed by a known probability distribution.

A. Structure of the survey

First, some useful concepts from the theory of computation are presented (Section I-B); then a brief introduction to the main topic is given (Section II), the basic definitions needed to understand the theory are stated (Section II-A), some of the most interesting results are reported (Section II-B), and parallels are drawn with classical information theory (Section III). Then, some works extending Kolmogorov complexity theory to different areas - namely game theory (Section IV-A), quantum information theory (Section IV-B) and computational complexity theory (Section IV-C) - are described, to give an idea of some developments of the original theory and of its links to other research areas. Last (Section V), a few applications of Kolmogorov complexity to real-life problems are shown, to highlight how this theory has proved useful in understanding real phenomena and in solving practical problems.

B. Notions from computability theory

To understand the definition of algorithmic complexity, some notions from computability theory are needed, which are briefly recalled in this section. In particular, the existence of a universal model for evaluating the difficulty of describing a string is crucial to the definition of algorithmic complexity. This model is provided by the Turing machine, described by Alan Turing in 1937; its universality comes from Church's thesis, which states that all (sufficiently powerful) computational models are equivalent (i.e. they can compute the same family of functions) and each of them can be simulated by a universal Turing machine [5]. Now that the role of the Turing machine in this context is clear, let us define this computational model.

1) Turing Machine: Many equivalent definitions (more or less formal) are possible for a Turing machine; a formal definition is not necessary here, and an intuitive idea should suffice. Intuitively, a 3-tape-symbol bounded transfer Turing machine [4] is a logical device formed by a control module (a finite-state automaton) and three tapes that can contain symbols or blanks; the tapes are read and written, one symbol at a time, through dedicated heads. The machine executes the instructions written on its input tape (which is read-only) by changing its internal state, reading and writing symbols on its work tape, and producing output by writing on its output tape.

[Fig. 1: Graphical representation of a Turing machine (adapted from [5]); the arrows indicate the positions of the read/write heads.]

From a mathematical point of view, a Turing machine can be viewed as a map from a set of finite-length strings (the programs written on the input tape) to the set of finite- or infinite-length strings (retrieved from the output tape). In this representation, the work tape is not considered, as it is used only internally by the machine. In this context, when considering a binary Turing machine ("binary computing machine" [4]), the set of functions f : {0,1}* -> {0,1}* computable by a Turing machine is called the set of partial recursive functions.

2) Church's thesis: Church's thesis (also known as the Church-Turing thesis) is a statement about the nature of computable (meaning computable by a machine) functions. As with the definition of the Turing machine, many equivalent formulations of the thesis exist; the best-known version is due to Kleene and reads: "Every computable numerical partial function is partial recursive." One of the main consequences of the thesis is that there exists a universal Turing machine able to simulate (with proper emulation instructions provided at the beginning of its input tape) any other Turing machine. The next section shows that this result is crucial, since the difficulty of describing an object is defined in terms of the length of the shortest program (i.e. sequence of input symbols) that makes the universal Turing machine halt having written the wanted string on its output tape.

II. MEASURING COMPLEXITY

As already stated, the concept of algorithmic complexity emerged from three different authors around the mid 1960s. Each of these authors (Chaitin, Solomonoff and Kolmogorov) arrives at the definition of complexity from a different background and with different notations. In particular, Chaitin [4] applies information-theoretic and probabilistic ideas to recursive function theory; Solomonoff [12,13] arrives at complexity while trying to obtain the prior probability of strings from an inductive inference point of view; Kolmogorov [7] directly proposes a novel algorithmic way of quantitatively defining information. It is very interesting to see how the three of them converged to the same results starting from very different areas of research. The following treatment tries to unify the notation of the three sources, in order to provide a coherent and understandable overview of the main ideas of the theory.
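Before moving to the formal definitions, the machine model of Section I-B can be made concrete. The following minimal Python sketch (an illustration added to this survey, not part of the original papers; all names are chosen for readability) simulates a one-tape Turing machine; the three-tape model described above is richer, but computationally equivalent.

```python
# A minimal one-tape Turing machine simulator (illustrative sketch only).

def run_tm(delta, tape, state="q0", halt="halt", blank="B", max_steps=10_000):
    """delta maps (state, symbol) -> (next_state, written_symbol, move in {-1, +1})."""
    cells = dict(enumerate(tape))          # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == halt:                  # machine halted: read off the output
            return "".join(cells[i] for i in sorted(cells) if cells[i] != blank)
        key = (state, cells.get(head, blank))
        if key not in delta:
            raise RuntimeError("no applicable transition: machine rejects")
        state, cells[head], move = delta[key]
        head += move
    raise RuntimeError("step limit exceeded (the machine may not halt)")

# Example: complement every bit of the input, then halt at the first blank.
flip = {
    ("q0", "0"): ("q0", "1", +1),
    ("q0", "1"): ("q0", "0", +1),
    ("q0", "B"): ("halt", "B", +1),
}
print(run_tm(flip, "0110"))  # prints 1001
```

In this representation, a "program" for the machine is the pair (transition table, input tape), which matches the view of a Turing machine as a map from finite input strings to output strings.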
A. Definitions

1) Kolmogorov complexity: Given a finite-length binary string x of length l(x), denoting a universal Turing machine by M and writing M(P) for the output of M when executing the program P (which, in turn, is a finite binary sequence of length l(P)), we can define Kolmogorov (or algorithmic) complexity as the minimum length of a program that, evaluated by M, makes it return x as the output and then halt (this same definition can be found, albeit with different notations, in all of [4], [7], [12,13]).

Definition 1 (Kolmogorov complexity):

$$K_M(x) \triangleq \min_{P \,:\, M(P) = x} l(P) \qquad (1)$$

An interesting interpretation of K_M(x) is given by Solomonoff [12], where the a priori probability of a binary string x is defined as $2^{-K_M(x)}$. This shows that algorithmic complexity can be used as a measure of randomness: the higher the complexity, the more random the string. To convey the idea, Solomonoff [13] states that this method of inductive inference (i.e. of finding the prior probability of a string) can, in a sense, be seen as an inversion of Huffman coding, where the minimal code of the string is given and, from this code, it is possible to derive the probability of the string. Note that these operations can be performed without the string being governed by any known probability distribution.

2) Conditional Kolmogorov complexity: For any program (i.e. algorithmic description) to correctly yield the wanted string x when processed by a universal Turing machine M, it must include information about the length l(x) of the string, if this information is not already available to the machine. When this is the case, the definition of complexity given in Definition 1 applies. On the other hand, it is possible to consider the case where the length of the string is hardcoded in, or separately given to, the Turing machine used for the computation; in this case, the needed program is shorter than one including information about l(x), and the conditional Kolmogorov complexity [5,7] is defined as:

Definition 2 (Conditional Kolmogorov complexity):

$$K_M(x \mid l(x)) \triangleq \min_{P \,:\, M(P,\, l(x)) = x} l(P) \qquad (2)$$

3) Examples of complexity: The definition of algorithmic complexity provides a clever way of quantifying how difficult it is to describe an object, and it should appear clear that this is related to the randomness of the object and to how much information it can convey. To get an intuitive idea of how this works, a couple of examples of more or less complex objects follow. Consider the following objects chosen from the domain of alphanumeric strings:

1) KolmogorovKolmogorovKolmogorovKolmogorovKolmogorovKolmogorov
2) acj73fd3hw24f3dj4s6e2dsg457hew46fsda34701sths45fa554nary5782

It should be quite intuitive that the complexity of the second string is far greater than the complexity of the first one. Keeping in mind that any computable function, with proper transformations, may be calculated by a Turing machine, the following considerations stay at a high level of abstraction, to give an intuitive feeling for the matter. The first string is simply the repetition of the word Kolmogorov six times, so its algorithmic complexity can be no greater than the length of the substring Kolmogorov plus a little extra information about the length of the whole string. Its conditional complexity is even lower, since the information about the length of the string does not need to be included in the description. Considering that describing an extension of this string (i.e. the repetition of the word Kolmogorov n times, with n > 6) adds little to the length of the description, it is easy to see that the complexity of such a string, as n grows, remains far below the length of the string.
The second string looks quite random and, probably, there is no better description, in terms of an algorithm to derive it, than reporting the exact characters of the string, plus the overhead of indicating its length. In this case, the complexity of the string is greater than its length, and the conditional complexity is about the same as the length of the string. Other interesting examples arise when considering objects that can be efficiently described using mathematics. For instance, the binary encoding of π up to the n-th decimal digit can be algorithmically described by a program whose size is almost constant for any n. Hence, this string of bits, which could look pretty random at first glance, in fact has a fairly low algorithmic complexity. The following definitions and theorems give formal shape to the intuitions provided by the above examples.

4) Algorithmic randomness and incompressibility: The notions of algorithmic complexity defined above make it possible to state the conditions under which a string x = (x_1 x_2 ... x_n) of length n can be considered random. A definition of this concept of algorithmic randomness [3,5] follows:

Definition 3 (Algorithmic randomness):

$$x \text{ is algorithmically random} \iff K(x \mid n) \geq n \qquad (3)$$

Definition 3 states that a string is to be considered random if its conditional Kolmogorov complexity (i.e. the length of its minimal description, given the length of the string) is at least the length of the string itself. This makes sense in light of the examples of Section II-A3, where the string that looks less random has an algorithmic complexity smaller than its length, while the opposite holds for the random string. The concept of conditional Kolmogorov complexity also allows us to define another property of a string x = (x_1 x_2 ... x_n), correlated with its algorithmic randomness. This property, defined below, is incompressibility [5]:

Definition 4 (Incompressibility):

$$x \text{ is incompressible} \iff \lim_{n \to \infty} \frac{K(x \mid n)}{n} = 1 \qquad (4)$$

Roughly speaking, a string is incompressible if the length of its minimal description - given the length of the string - tends to the length of the string as n goes to infinity. This definition gives an interesting interpretation of Kolmogorov complexity in terms of how much a string can be compressed. In fact, a string with low Kolmogorov complexity can be computed by a short program, which can be seen as a compressed version of the string. The original string can be reconstructed by running the program on an appropriate Turing machine, which plays the role of a decompressor. In the next section, some of the most interesting and meaningful results based on the definitions given above are presented.

B. Main results

The original papers that introduced the concept of algorithmic complexity, and subsequent work by the same authors and others, provide a wide range of results based on the definitions of Section II-A. Probably the most fundamental one is that Kolmogorov complexity is not computable, but it can be approximated from above by a computable process (hence, an upper bound exists, as shown below). The following treatment presents this and some other interesting results.

1) Universality: The first important result should look almost obvious after the brief discussion of computability theory in Section I-B; it is presented in the following theorem [5]:

Theorem 1 (Computer independence of Kolmogorov complexity): If U is a universal Turing machine, then for any other Turing machine (and, more generally, for any computing machine) A there exists a constant c_A such that:

$$K_U(x) \leq K_A(x) + c_A$$

The constant c_A is due to the overhead needed in the program for the universal Turing machine U to provide the instructions on how to emulate the other computer A, and it can be safely neglected when the length of x is large enough. Hence, thanks to the existence of a universal Turing machine able to simulate any other computational machinery, the algorithmic complexity of an object is independent of any specific computer, up to a constant term that becomes negligible as l(x) = n goes to infinity. This result is crucial to making Kolmogorov complexity a useful measure that does not refer to a particular computing device. For the following two results, the field is restricted to binary strings and binary computers; note that this restriction does not result in a loss of generality, since any string may be represented in a binary alphabet by translating it with a proper encoding. Also, the notations K(x) and K(x | l(x)) are used without specifying the particular Turing machine, thanks to the result shown in Theorem 1.
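Although K(x) itself is not computable, any lossless compressor yields a computable upper estimate of it. As an illustration (added here, not part of the original survey; zlib is used purely as an off-the-shelf compressor), the following sketch applies compression to the two example strings of Section II-A3: the regular string compresses far below its length, while the random-looking one does not.

```python
import zlib

def compressed_size(s: str) -> int:
    """Size in bytes of the zlib-compressed string: a crude, computable
    upper estimate of its Kolmogorov complexity (up to constant overhead)."""
    return len(zlib.compress(s.encode("ascii"), 9))

s1 = "Kolmogorov" * 6   # the regular string of Section II-A3
s2 = "acj73fd3hw24f3dj4s6e2dsg457hew46fsda34701sths45fa554nary5782"

print(len(s1), compressed_size(s1))  # e.g. 60 bytes -> ~20 bytes: compressible
print(len(s2), compressed_size(s2))  # e.g. 61 bytes -> ~70 bytes: incompressible
```

Note that the random-looking string may even expand slightly under compression, reflecting the constant overhead that also appears in the upper bounds below.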
2) Upper bound on Kolmogorov complexity: An upper bound on algorithmic complexity can easily be established, first for conditional complexity:

Theorem 2 (Upper bound for conditional complexity):

$$K(x \mid l(x)) \leq l(x) + c$$

where c is a nonnegative constant. The proof is quite simple: an effective program for obtaining a string x, when l(x) is known, is simply formed by the string itself, with at most a constant overhead dependent on the computer the program is written for.
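The proof idea, and the self-delimiting refinement used in the next theorem, can be made concrete with a toy encoding (a hypothetical sketch added here, not the construction of the cited papers): a "literal" program is the string itself preceded by a fixed opcode, and making it self-delimiting costs about 2 log l(x) extra bits, e.g. by writing every bit of the length twice and terminating with the pair 01.

```python
import math

def literal_program(x: str) -> str:
    """A program for a machine that is given l(x): a fixed opcode followed by
    the raw string. Its length is l(x) + c, witnessing Theorem 2."""
    return "PRINT:" + x

def self_delimiting_program(x: str) -> str:
    """When l(x) is not given, prepend a self-delimiting encoding of the length:
    every bit of bin(l(x)) is doubled, and the pair '01' marks the end. The
    header costs about 2*log2(l(x)) bits, the overhead of the next theorem."""
    length_bits = format(len(x), "b")                    # e.g. l(x)=5 -> '101'
    header = "".join(2 * b for b in length_bits) + "01"  # '101' -> '110011' '01'
    return "PRINT:" + header + x

x = "10110"
print(literal_program(x))           # PRINT:10110
print(self_delimiting_program(x))   # PRINT:1100110110110
print(2 * math.ceil(math.log2(len(x) + 1)))  # header bits, excluding terminator
```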

Theorem 2 can be extended to Kolmogorov complexity proper, when the length of the string is not known (i.e. when a program to compute the string must be self-delimiting [3]), by adding a term that accounts for including the information about l(x):

Theorem 3 (Upper bound for Kolmogorov complexity):

$$K(x) \leq K(x \mid l(x)) + 2 \log l(x) + c$$

The upper bound comes from the fact that a trivial method of making the program self-delimiting requires at most 2 log l(x) + c bits [5]; it can be refined by finding an optimized way to represent l(x) (i.e. by using iterated logarithms [4,5]).

3) Lower bound on Kolmogorov complexity: A probably more interesting result is a lower bound on algorithmic complexity or, more properly, a bound on the number of strings within a certain complexity. This is interesting because it gives a flavor of how likely it is for a string to be complex or simple to describe.

Theorem 4 (Lower bound on Kolmogorov complexity [3,5]):

$$\left|\{x \in \{0,1\}^* : K(x) < k\}\right| < 2^k$$

The theorem shows that there are not many strings with low complexity; it is proven simply by observing that the number of binary programs of length less than k is 2^k - 1 < 2^k.

4) Information-theoretic version of Gödel's theorem: A more advanced result shown by Chaitin [3] is briefly reported here (without any claim of being exhaustive), as an aside showing how algorithmic information theory can provide an alternative route to Gödel's famous theorem. Chaitin shows that, in an axiomatic theory, a lower bound n on the algorithmic complexity of a certain string of symbols defined in the theory can be established only if n is less than the algorithmic complexity of the axioms of the formal theory (i.e. the axioms used in the demonstration of the bound). This exhibits an inherent limitation of axiomatic theories, comparable to Gödel's incompleteness theorem, via an information-theoretic argument based on Kolmogorov complexity.

III. COMPLEXITY AND ENTROPY

Algorithmic complexity and entropy are defined from very different backgrounds and apparently measure different aspects of an object. Despite these apparent differences, it is possible to show that complexity and entropy are both measures of the randomness of a string and, under some hypotheses, a relationship between the two can be proven, as shown in Theorem 5.

Theorem 5 (Relation between Kolmogorov complexity and entropy [5,8]): Let $\{X_i\}$ be a stochastic process drawn i.i.d. according to $f(x)$, $x \in \mathcal{X}$, $|\mathcal{X}| < \infty$, and let $f(x^n) = \prod_{i=1}^{n} f(x_i)$. Under these conditions, it can be proven that

$$E\left[\frac{1}{n} K(X^n \mid n)\right] \to H(X), \quad \text{as } n \to \infty.$$

The above theorem shows that algorithmic complexity and entropy turn out to be very similar measures. Of course, this result holds only under the specified hypotheses, which ensure that both complexity and entropy are well defined. This makes Kolmogorov complexity an even more meaningful tool, since it agrees with entropy when both are defined, but can also be used in the more general context where a probability distribution (or a good estimate of it) is not available for the studied object.

IV. FURTHER DEVELOPMENTS

Section II presented the main definitions and the most interesting results of the original theory of algorithmic complexity. In the roughly ten years after the first papers (published in the mid 1960s), the theory grew into a complete theoretical system and, in 1977, Chaitin published a review of what he called algorithmic information theory [3]. Beyond these direct developments, Kolmogorov complexity has been extended to very different areas. The following sections present some of these branches, giving an idea of how wide the scope of the concept of algorithmic complexity is.
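Before turning to these extensions, the relation of Theorem 5 can be checked empirically with a small experiment (an illustration added here, not from the survey). The sketch below draws i.i.d. Bernoulli(p) bit strings and compares the per-symbol compressed length - a computable upper proxy for K(X^n | n)/n - with the binary entropy H(p).

```python
import math
import random
import zlib

def H(p: float) -> float:
    """Binary entropy in bits per symbol."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def compressed_rate(p: float, n: int = 100_000) -> float:
    """Compressed bits per symbol of an i.i.d. Bernoulli(p) bit string.
    Compression upper-bounds complexity, so this rate should approach H(p)
    from above as n grows, in the spirit of Theorem 5."""
    bits = "".join("1" if random.random() < p else "0" for _ in range(n))
    packed = int(bits, 2).to_bytes((n + 7) // 8, "big")  # 8 symbols per byte
    return 8 * len(zlib.compress(packed, 9)) / n

random.seed(0)
for p in (0.1, 0.3, 0.5):
    print(f"p={p}: H(p)={H(p):.3f}, compressed rate={compressed_rate(p):.3f}")
# The measured rate stays above H(p) and is ordered the same way:
# small for strongly biased sources, close to 1 bit/symbol for p = 0.5.
```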

A. Game interpretation

One of the most fascinating areas where Kolmogorov complexity has been applied is game theory. The link between the two theories is provided by Muchnik's theorem, which relates the theory of recursive sequences (on which algorithmic complexity theory is based) and game theory. In particular, this theorem associates every statement φ of recursion theory with a two-player game G_φ with infinitely many moves represented by 0s and 1s. Considering this game, it is shown that if one of the players (called the Mathematician) has a computable winning strategy, then the statement φ is true, while if the other player (called Nature) has a computable winning strategy, then φ is false. Since Kolmogorov complexity is defined upon recursion theory, it is possible to exploit Muchnik's theorem to analyse the truth value of statements about the theory by building a proper game, as defined in the theorem. This provides an interesting and unconventional point of view on Kolmogorov complexity and is extensively treated (with several examples) by Vereshchagin [15]. Other results in the field of algorithmic complexity are obtained with a game-theoretic approach by Muchnik and others in a very recent paper [10]. Here, again, it is shown that statements about Kolmogorov complexity can be proven by constructing a special game and a winning strategy in it.

B. Quantum complexity

The increasing interest in quantum computing has encouraged studies aimed at extending Kolmogorov complexity to the domain of quantum computation. Such an extension requires first redefining the relevant concepts of computation theory in this new area. Quite a lot of work has been done in this direction, and Vitanyi gives an overview of the results in a paper [16] whose title recalls the original 1965 paper by Kolmogorov. In particular, it is shown that Kolmogorov complexity can be extended quite naturally on the basis of quantum Turing machines, and that it can be used to describe the amount of information contained in a pure quantum state (i.e. a set of variables that fully describes a quantum system in probabilistic terms). One analogy between quantum algorithmic complexity and the classical version is that the former, like the latter, is upper bounded and can be approximated from above by a computable process with arbitrarily small probability of error.

C. Complexity and problem spaces

One of the fields where an extension of the ideas of Kolmogorov complexity seems most natural is complexity theory. An interesting work by Allender, Buhrman and Koucký [1] investigates whether it is possible to characterize the complexity class PSPACE of problems solvable in polynomial space on a Turing machine by efficiently reducing it to the set R_K of algorithmically random strings (as given in Definition 3). The paper is quite technical and leaves many open problems; the interesting fact here is that the ideas behind Kolmogorov complexity can be employed in a new approach to the study of problems in a very wide variety of research areas.

V. APPLICATIONS

Kolmogorov complexity has so far been extended and exploited in a number of different research areas, as shown in Section IV. Beyond this, the concepts of the theory have been used in a variety of practical applications, where they provide a distinctive approach to the analysis of real-world problems.
The following sections illustrate a few recent applications of Kolmogorov complexity.

A. Information assurance

Kolmogorov complexity can be applied to retrieve important information about the state of a system; in particular, it has been used by Evans, Bush and Hershey [6] to design an approach to monitoring an information system against security flaws in both data and processes. In this work, the apparent complexity K(S) is defined as the best estimate of Kolmogorov complexity available to a party analysing the system, and two metrics are proposed to evaluate the vulnerability of a process with input X and output Y:

- K(X.Y), defined as the complexity of the concatenated input and output of a process;
- K(X | Y), representing the relative (conditional) complexity of a process.

Based on these measures, the vulnerability of both processes and data is analysed, showing that the higher both quantities are, the less vulnerable the system is.

This result states, intuitively, that the more complex the operations carried out by a certain process appear (from the point of view of a potential attacker), the harder it is for the attacker to understand what the process is doing in order to harm the system. Thanks to these results, a method to monitor the security level of a system, based on the ability to estimate the apparent complexity of the system to an attacker, is proposed, and the conclusion is that Kolmogorov complexity is a good candidate for further developments in this area.

B. Spam filtering

Another clever application of Kolmogorov complexity has been published by Spracklin and Saxton [14], who apply it in a spam filter. The underlying idea is that, when the words associated with spam are highlighted in a text, their distribution will be quite random in normal mail, while it will obey some ordered criterion in spam. This idea is applied by encoding the text of an e-mail into a binary string, where the words associated with spam are represented by a 1 and the other words by a 0. Then, Kolmogorov complexity can be used to evaluate how random the resulting binary string is. The issue here is that Kolmogorov complexity is not computable; so the idea is to apply a compression algorithm (run-length compression in this specific case) to get an estimate of Kolmogorov complexity. Based on this estimate, messages with high complexity are classified as non-spam, while messages yielding a low complexity are identified as spam. The authors show that this approach achieves high accuracy (80% to 96%) and is much faster than other approaches used for this purpose (for instance, Bayesian filters).

C. Mental fatigue

Since Kolmogorov complexity applies to any object encodable as a binary string, it can be exploited on virtually any data. For instance, Kolmogorov complexity is used in a medical context by Lian-yi and Chong-xun [9] to evaluate the level of mental fatigue of a person based on EEG (electroencephalogram) signals. As in Section V-B [14], an estimate of Kolmogorov complexity must be computed in order to analyse the data; in this work, this is done by using a modified version of the Lempel-Ziv algorithm applied to the discrete sequences representing the EEG signals (a minimal sketch of such an estimator follows). The results are encouraging, as it is shown that the estimated Kolmogorov complexity of the EEG decreases as mental fatigue increases. This indicates that the EEG signal is somehow less random when a person is in a state of mental fatigue.
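The sketch below (my illustration, not the exact algorithm of [9] or [14]) computes a Lempel-Ziv-style phrase count of a binary sequence, of the kind used to estimate Kolmogorov complexity for binarized e-mail or EEG data; the normalization by n / log2(n) is the standard one for binary alphabets.

```python
import math
import random

def lz_phrase_count(s: str) -> int:
    """LZ78-style parsing: scan s left to right and cut a new phrase whenever
    the segment read so far has not been seen before. The phrase count c(n)
    is a computable stand-in for Kolmogorov complexity (up to a log factor)."""
    phrases, current = set(), ""
    for ch in s:
        current += ch
        if current not in phrases:
            phrases.add(current)
            current = ""
    return len(phrases) + bool(current)   # count any trailing partial phrase

def normalized_complexity(s: str) -> float:
    """c(n) * log2(n) / n: near 0 for very regular binary sequences,
    near 1 for random ones."""
    n = len(s)
    return lz_phrase_count(s) * math.log2(n) / n

random.seed(0)
regular = "01" * 5_000                                         # periodic sequence
chaotic = "".join(random.choice("01") for _ in range(10_000))  # random sequence
print(f"regular: {normalized_complexity(regular):.2f}")  # low: highly ordered
print(f"random:  {normalized_complexity(chaotic):.2f}")  # close to 1
```

In a classifier such as the spam filter of [14], such a normalized score would simply be thresholded: sequences scoring low are flagged as ordered (spam), high-scoring ones as random (normal mail).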
D. Complexity in machine learning

Machine learning is an applied research field where randomness is often exploited in algorithms (for example, in the training algorithm of a neural network). One of the notions used in this field is Occam's razor (referring to the 14th-century English logician, theologian and Franciscan friar William of Ockham), which can be - a little improperly - summarized as: the simplest explanation is the best. In machine learning terms, this means that the simpler the rules found to explain the training data, the better the generalization on the testing data. One problem in classical machine learning applications is the lack of a general and well-founded way of determining which rules are the simplest for a given data set. Schmidhuber [11] discusses this issue and proposes a method based on Levin complexity (a time-bounded extension of Kolmogorov complexity) to impose an ordering on the complexity of sets of rules, allowing the best ones to be chosen efficiently. In this work, some experiments on simple problems (chosen to be computationally feasible) are described, showing that the proposed method yields solutions with generalization performance unmatched by other training algorithms.

VI. CONCLUSION

Kolmogorov complexity is a fascinating concept which captures how the complexity of an object can be described by scientific and rigorous means. This paper should have given an idea of how powerful and meaningful the ideas first introduced by Solomonoff, Chaitin and Kolmogorov are, and how they have opened new possibilities in a great number of research areas and applications. Beyond the achieved results, many open problems remain, and today the idea of Kolmogorov complexity can help in understanding still unsolved complex problems. For instance, strong but not fully understood links are believed to exist between Kolmogorov complexity and different areas of physics, from thermodynamics to black holes. As the bottom line, the field of Kolmogorov complexity is a mature but still very active research area, able to provide many useful results and answers, while leaving some appealing and challenging problems still to be solved.

REFERENCES

[1] E. Allender, H. Buhrman, and M. Koucký, "What can be efficiently reduced to the Kolmogorov-random strings?" Annals of Pure and Applied Logic, vol. 138, no. 1-3, pp. 2-19, 2006.
[2] G. J. Chaitin, "A theory of program size formally identical to information theory," Journal of the ACM, vol. 22, pp. 329-340, 1975.
[3] G. J. Chaitin, "Algorithmic information theory," IBM Journal of Research and Development, vol. 21, pp. 350-359, 1977.
[4] G. J. Chaitin, "On the length of programs for computing finite binary sequences: statistical considerations," Journal of the ACM, vol. 13.
[5] T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed. New York: Wiley, 2006, ch. 14.
[6] S. Evans, S. Bush, and J. Hershey, "Information assurance through Kolmogorov complexity," in Proc. DARPA Information Survivability Conference & Exposition II (DISCEX'01), vol. 2, 2001.
[7] A. N. Kolmogorov, "Three approaches to the quantitative definition of information," Problems of Information Transmission, vol. 1, no. 1, pp. 1-7, 1965.
[8] S. Leung-Yan-Cheong and T. Cover, "Some equivalences between Shannon entropy and Kolmogorov complexity," IEEE Transactions on Information Theory, vol. 24, no. 3, May 1978.
[9] Z. Lian-yi and Z. Chong-xun, "Analysis of Kolmogorov complexity in spontaneous EEG signal and its application to assessment of mental fatigue," in Proc. 2nd International Conference on Bioinformatics and Biomedical Engineering (ICBBE 2008), May 2008.
[10] A. A. Muchnik, I. Mezhirov, A. Shen, and N. Vereshchagin, "Game interpretation of Kolmogorov complexity," ArXiv e-prints, Mar. 2010.
[11] J. Schmidhuber, "Discovering solutions with low Kolmogorov complexity and high generalization capability," in Machine Learning: Proceedings of the Twelfth International Conference. Morgan Kaufmann, 1995.
[12] R. Solomonoff, "A preliminary report on a general theory of inductive inference," 1960.
[13] R. Solomonoff, "A formal theory of inductive inference, part I," Information and Control, vol. 7, pp. 1-22, 1964.
[14] L. Spracklin and L. Saxton, "Filtering spam using Kolmogorov complexity estimates," in Proc. 21st International Conference on Advanced Information Networking and Applications Workshops (AINAW'07), vol. 1, May 2007.
[15] N. Vereshchagin, "Kolmogorov complexity and games."
[16] P. Vitanyi, "Three approaches to the quantitative definition of information in an individual pure quantum state," in Proc. 15th Annual IEEE Conference on Computational Complexity, 2000.


More information

Computer Sciences Department

Computer Sciences Department Computer Sciences Department 1 Reference Book: INTRODUCTION TO THE THEORY OF COMPUTATION, SECOND EDITION, by: MICHAEL SIPSER Computer Sciences Department 3 ADVANCED TOPICS IN C O M P U T A B I L I T Y

More information

An Algebraic Characterization of the Halting Probability

An Algebraic Characterization of the Halting Probability CDMTCS Research Report Series An Algebraic Characterization of the Halting Probability Gregory Chaitin IBM T. J. Watson Research Center, USA CDMTCS-305 April 2007 Centre for Discrete Mathematics and Theoretical

More information

Limits of Computation

Limits of Computation The real danger is not that computers will begin to think like men, but that men will begin to think like computers Limits of Computation - Sydney J. Harris What makes you believe now that I am just talking

More information

MODULE -4 BAYEIAN LEARNING

MODULE -4 BAYEIAN LEARNING MODULE -4 BAYEIAN LEARNING CONTENT Introduction Bayes theorem Bayes theorem and concept learning Maximum likelihood and Least Squared Error Hypothesis Maximum likelihood Hypotheses for predicting probabilities

More information

258 Handbook of Discrete and Combinatorial Mathematics

258 Handbook of Discrete and Combinatorial Mathematics 258 Handbook of Discrete and Combinatorial Mathematics 16.3 COMPUTABILITY Most of the material presented here is presented in far more detail in the texts of Rogers [R], Odifreddi [O], and Soare [S]. In

More information

Complexity Theory Part I

Complexity Theory Part I Complexity Theory Part I Outline for Today Recap from Last Time Reviewing Verifiers Nondeterministic Turing Machines What does nondeterminism mean in the context of TMs? And just how powerful are NTMs?

More information

TURING MAHINES

TURING MAHINES 15-453 TURING MAHINES TURING MACHINE FINITE STATE q 10 CONTROL AI N P U T INFINITE TAPE read write move 0 0, R, R q accept, R q reject 0 0, R 0 0, R, L read write move 0 0, R, R q accept, R 0 0, R 0 0,

More information

Undecidable Problems. Z. Sawa (TU Ostrava) Introd. to Theoretical Computer Science May 12, / 65

Undecidable Problems. Z. Sawa (TU Ostrava) Introd. to Theoretical Computer Science May 12, / 65 Undecidable Problems Z. Sawa (TU Ostrava) Introd. to Theoretical Computer Science May 12, 2018 1/ 65 Algorithmically Solvable Problems Let us assume we have a problem P. If there is an algorithm solving

More information

Lecture Notes on Inductive Definitions

Lecture Notes on Inductive Definitions Lecture Notes on Inductive Definitions 15-312: Foundations of Programming Languages Frank Pfenning Lecture 2 September 2, 2004 These supplementary notes review the notion of an inductive definition and

More information

Pure Quantum States Are Fundamental, Mixtures (Composite States) Are Mathematical Constructions: An Argument Using Algorithmic Information Theory

Pure Quantum States Are Fundamental, Mixtures (Composite States) Are Mathematical Constructions: An Argument Using Algorithmic Information Theory Pure Quantum States Are Fundamental, Mixtures (Composite States) Are Mathematical Constructions: An Argument Using Algorithmic Information Theory Vladik Kreinovich and Luc Longpré Department of Computer

More information

Introduction to Turing Machines. Reading: Chapters 8 & 9

Introduction to Turing Machines. Reading: Chapters 8 & 9 Introduction to Turing Machines Reading: Chapters 8 & 9 1 Turing Machines (TM) Generalize the class of CFLs: Recursively Enumerable Languages Recursive Languages Context-Free Languages Regular Languages

More information

where Q is a finite set of states

where Q is a finite set of states Space Complexity So far most of our theoretical investigation on the performances of the various algorithms considered has focused on time. Another important dynamic complexity measure that can be associated

More information

Prefix-like Complexities and Computability in the Limit

Prefix-like Complexities and Computability in the Limit Prefix-like Complexities and Computability in the Limit Alexey Chernov 1 and Jürgen Schmidhuber 1,2 1 IDSIA, Galleria 2, 6928 Manno, Switzerland 2 TU Munich, Boltzmannstr. 3, 85748 Garching, München, Germany

More information

Theory of Computing Tamás Herendi

Theory of Computing Tamás Herendi Theory of Computing Tamás Herendi Theory of Computing Tamás Herendi Publication date 2014 Table of Contents 1 Preface 1 2 Formal languages 2 3 Order of growth rate 9 4 Turing machines 16 1 The definition

More information

CS151 Complexity Theory. Lecture 1 April 3, 2017

CS151 Complexity Theory. Lecture 1 April 3, 2017 CS151 Complexity Theory Lecture 1 April 3, 2017 Complexity Theory Classify problems according to the computational resources required running time storage space parallelism randomness rounds of interaction,

More information

arxiv: v1 [cs.it] 17 Sep 2017

arxiv: v1 [cs.it] 17 Sep 2017 Kolmogorov Complexity and Information Content arxiv:1710.06846v1 [cs.it] 17 Sep 2017 Fouad B. Chedid Abstract In this paper, we revisit a central concept in Kolmogorov complexity in which one would equate

More information

Griffith University 3130CIT Theory of Computation (Based on slides by Harald Søndergaard of The University of Melbourne) Turing Machines 9-0

Griffith University 3130CIT Theory of Computation (Based on slides by Harald Søndergaard of The University of Melbourne) Turing Machines 9-0 Griffith University 3130CIT Theory of Computation (Based on slides by Harald Søndergaard of The University of Melbourne) Turing Machines 9-0 Turing Machines Now for a machine model of much greater power.

More information

An introduction to basic information theory. Hampus Wessman

An introduction to basic information theory. Hampus Wessman An introduction to basic information theory Hampus Wessman Abstract We give a short and simple introduction to basic information theory, by stripping away all the non-essentials. Theoretical bounds on

More information