Kolmogorov Complexity


Krzysztof Zawada
University of Illinois at Chicago
kzawada@uic.edu

Abstract: Information theory is a branch of mathematics that attempts to quantify information. To quantify information one needs to look at data compression and transmission rates. Information theory has applications in many fields. The purpose of this paper is to comprehensively present a subset of information theory as it applies to computer science. Andrei Kolmogorov was the pioneering mathematician who applied information theory to computer science. The branch of information theory that relates to computer science became known as Kolmogorov complexity or algorithmic complexity.

I. INTRODUCTION

In 1965 Andrei Kolmogorov, a researcher and mathematician, presented a definition for the descriptive complexity of an object. He stated that the complexity of an object is the length of the shortest binary computer program that describes the object. Kolmogorov complexity, or algorithmic complexity, studies the complexity of strings and other data structures. Although he was the pioneer in this area, other researchers such as Solomonoff and Chaitin contributed to this idea in the same time frame (1960s-1970s). As a real-world example of measuring the computational resources needed to specify an object, consider an embedded system. Embedded systems use hardware and software to accomplish something specific for an application. They use messaging schemes, data structures, and algorithms that define information and transmit it reliably. There is a high level of complexity that produces something deterministic (real-time) using limited computational resources. Kolmogorov complexity is an abstract and deep idea, and it can be used to state or prove many impossibility results: statements that are true but cannot be proven. In essence, there is no way to actually find the shortest computer program for a given string in practice, because this might take an infinite amount of time. Kolmogorov complexity is therefore best regarded as a way of thinking and a good basis for inductive inference, fundamental to the understanding of computer science as well as other areas such as physics and communication theory. This paper presents the basic idea, the contributions of the original authors, and new developments in the area.

II. KOLMOGOROV COMPLEXITY DEFINITION AND EXPLANATION

To understand and explain the concept of Kolmogorov complexity, a few classic examples are used. Since it is a study of the complexity of strings and other data structures, consider the example strings in Figure 1.

Fig. 1. Kolmogorov Examples

Kolmogorov complexity attempts to answer the following questions: (a) What is the length of the shortest description of the above strings or sequences? (b) What is the shortest binary computer program for the above strings or sequences? (c) What is the minimum number of bits needed to describe the sequences or strings above? Analyzing the strings, it is easy to see that each has a length and an English-language description. Some can be described easily, but others cannot. In fact, the only way to describe the more complex and random strings is by giving the string itself.

To find the shortest computer program that could generate the above strings would involve a CPU, memory, input/output, an encoding (ASCII), and a description language such as assembly, C/C++, or Java Virtual Machine byte code. Even though these details would seem to matter, Kolmogorov showed that the definition of complexity is computer independent. Cover and Thomas [6] explain this and define the Kolmogorov complexity of a string with respect to a universal computer as

K_U(x) = min_{p : U(p) = x} l(p),

where x is a string, U is a universal computer, and p ranges over programs. This equation states that K_U(x) is the length of the shortest description (program) of the string x interpreted by a universal computer. The universal computer used is the Turing machine, developed by Alan Turing as the simplest universal computer. It is a widely used computational model that simplifies a computer for analysis: the input tape holds a binary program that is fed to the machine, which uses a state machine to interpret the instructions and produce a binary output on another tape. This is a simple and good model of a digital computer for academic analysis. The computer independence of the definition is expressed by the inequality

K_U(x) ≤ K_A(x) + c_A,

where A is any other computer and c_A is a constant that does not depend on the string x. The constant c_A accounts for the difference between using the universal computer and computer A, which may have a larger instruction set or more built-in functions. The idea that the universal computer has a program of some minimal length that generates a given string was developed further: the minimal description cannot be much larger than the string itself. The proof is provided by Cover and Thomas [6]. Using conditional Kolmogorov complexity, with the precondition that the length of the string is known, the minimum program length relates to the length of the string as follows:

K(x | l(x)) ≤ l(x) + c and K(x) ≤ K(x | l(x)) + 2 log l(x) + c (upper bounds).
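
As a concrete (and deliberately crude) illustration of these upper bounds, the sketch below, my own and not from the paper, treats the length of a Python expression that evaluates to a string as a machine-dependent stand-in for program length; the constant overhead of this choice plays the role of the c_A above.

```python
# A crude, machine-dependent upper bound on K(x): the length of a Python
# expression that evaluates to the string x. Regular strings admit very short
# descriptions; a patternless string is best described by the literal itself.
import os

regular = "01" * 5000                  # highly regular string, 10,000 characters
random_ = os.urandom(5000).hex()       # 10,000 hex characters with no visible pattern

desc_regular = '"01" * 5000'           # 11-character description of the regular string
desc_random = repr(random_)            # shortest obvious description: the literal

assert eval(desc_regular) == regular and eval(desc_random) == random_
print(len(regular), len(desc_regular))   # 10000 vs 11    -> huge compression
print(len(random_), len(desc_random))    # 10000 vs 10002 -> essentially none
```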

This means that there is an upper bound on the size of the minimal description (or program). There also exists a lower bound on this minimal description. For a prefix-free set of programs, which satisfies the Kraft inequality, Cover and Thomas [6] give it as

Σ_{x^n} f(x^n) K(x^n | n) ≥ H(X_1, X_2, X_3, ..., X_n) = nH(X),

where f(x^n) is the probability of x^n. The relationship of Kolmogorov complexity to information theory is through entropy: the entropy is the expected length of the shortest binary computer description. It satisfies

H(X) ≤ (1/n) E K(X^n | n) ≤ H(X) + (|X| log n)/n + c/n,

where |X| is the alphabet size. The chain rule of information theory also has an analog in Kolmogorov complexity, given by

K(X, Y) = K(X) + K(Y | X) + O(log K(X, Y)).

Kolmogorov complexity can also be applied to find bounds in many other settings. For instance, in image compression, the algorithmic complexity of an image with n pixels that is compressed by some factor f can be bounded by K(image | n) ≤ n/f + c. Another example is gambling, where it can be shown that the logarithm of the wealth a gambler achieves on a sequence, plus the complexity of the sequence, is no smaller than the length of the sequence [6]. The idea also extends to the complexity of integers, algorithmically random and incompressible sequences, universal probability, statistics, and other applications. Although Kolmogorov complexity cannot be computed, computable approximations converge toward the true value, and this provides a framework for studying randomness and inference. The main contributors to the original idea are Kolmogorov, Chaitin, and Solomonoff. Their contributions are given next.

III. ORIGINAL PAPERS

A. Andrei Kolmogorov

The original ideas of complexity were published in two main papers. The first, in 1965, was titled Three approaches to the quantitative definition of information [11]. In this paper Kolmogorov presented the combinatorial, probabilistic, and algorithmic approaches to defining information. He stated that the two common approaches were the combinatorial and the probabilistic. When describing the combinatorial method, he used sets of elements, language alphabets, and Russian literary text as examples in deriving the entropy, mutual information, and bounds on information, and then attached meaning to these outcomes. With the probabilistic approach Kolmogorov focused on random variables with a given probability distribution. He showed that the mutual information I(X; Y) is non-negative in the combinatorial approach, whereas the probabilistic approach shows it can be negative, and that the average quantity I_W(X, Y) is the true measure of information content. The probabilistic approach was identified as more applicable to communication channels carrying many weakly related messages. The last topic in his paper was the algorithmic approach, which used the theory of recursive functions. To describe the length of a string x (l(x) = n), he proposed to do it recursively in log n + log log n + log log log n + ... bits, continuing until the last positive term. This gives a more efficient way to describe the length of a string. With the use of recursive functions, Kolmogorov stated that this allowed for a correct description of the quantity of hereditary information. He concluded that he intended to study the relationship between the necessary complexity of a program and its difficulty.
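
As a small illustration of the iterated-logarithm description of the length just mentioned, the sketch below (my own, not taken from Kolmogorov's paper) simply sums log n + log log n + ... until the terms stop being positive.

```python
# A sketch of the iterated-logarithm length description mentioned above:
# the length n is described in about log n + log log n + log log log n + ... bits,
# summing the terms until the last positive one.
import math

def iterated_log_bits(n):
    total, term = 0.0, math.log2(n)
    while term > 0:
        total += term
        term = math.log2(term) if term > 1 else 0.0  # next term; stop once non-positive
    return total

for n in (100, 10_000, 1_000_000):
    print(n, round(math.log2(n), 1), round(iterated_log_bits(n), 1))
# e.g. for n = 1,000,000: log n alone is about 19.9 bits, the iterated sum about 27.6 bits
```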

The second paper, published in 1968 and titled Logical basis for information theory and probability theory [12], linked the conditional entropy H(x | y) to the quantity of information required for computing. His definition of the conditional entropy H(x | y) was interpreted as the minimal length of a program P, recorded as a sequence of zeros and ones, that constructs the value x when the object y is given. The definition was

H(x | y) = min_{A(P, y) = x} l(P).

He saw a need to take the conditional entropy H(x | y) and the mutual information I(x : y) and attach a definite meaning to these concepts. He stated that he was following Solomonoff's work, since Solomonoff was the first to support this idea. After developing the conditional entropy definition, progress slowed because he could not derive algebraically the exact relations I(x : y) = I(y : x) and H(x, y) = H(x) + H(y | x). He was able to derive approximate forms of these concepts as inequalities; the new relations he gave were

I(x : y) − I(y : x) = O(log I(x, y)) and H(x, y) = H(x) + H(y | x) + O(log I(x, y)).

He went on to analyze long binary sequences x = (x_1, x_2, ..., x_l) of length l(x) to show that these sequences have entropy not smaller than their length, H(x) ≥ l(x). With his student Martin-Löf he next looked at the symptoms of randomness and related them to Bernoulli sequences. In this paper he also included Martin-Löf's work on m-Bernoulli-type sequences, which are characterized by a lower bound of roughly log C(l, k) − m on the conditional entropy H(x | l(x), k(x)), where k(x) is the number of ones in x. He also stated that Bernoulli sequences have logarithmically increasing entropy, H(x_l) = O(log l). These were the new basic information theory concepts that he found linking to probability theory, along with the introduction of random sequences, i.e., sequences that have no periodicity. He concluded the paper with a universal programming method A' satisfying H_{A'}(x | y) ≤ H_A(x | y) + C_A for any other method A, and credited both himself and Solomonoff for these accomplishments. Several other papers that he published built up to this idea [9] [8] [10].

B. Gregory Chaitin

Gregory Chaitin is a mathematician and computer scientist who contributed to Kolmogorov complexity. In 1966 he wrote a paper titled On the Length of Programs for Computing Finite Binary Sequences [2]. Chaitin goes into a great amount of detail about the functionality of Turing machines. He describes the Turing machine as a general-purpose input/output computer, as shown below.

Fig. 2. Turing Machine

The general-purpose computer scans the input tape for a binary program and subsequently produces a binary output on the tape. The Turing machine is defined to be an N-state M-tape-symbol machine, i.e., an N-row by M-column table. Chaitin's interest was in finding the shortest program for calculating an arbitrary binary sequence. The study includes several factors such as the number of symbols a Turing machine can write, changing the logic design, removing redundancies, recursion, and transferring from one part of the program to another. He concludes that there are exactly ((N+1)(M+2))^{NM} possible programs for an N-state M-tape-symbol Turing machine, and this means that the machine has NM log_2((N+1)(M+2)) bits of information.
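
To make the counting concrete, here is a small numeric check; the machine dimensions N and M are hypothetical values of my own choosing, not taken from Chaitin's paper.

```python
# Number of possible program tables for an N-state, M-tape-symbol machine,
# using the count quoted above: ((N+1)(M+2))**(N*M). Taking the base-2 logarithm
# gives the number of bits needed to single out one program, N*M*log2((N+1)(M+2)).
import math

N, M = 4, 2                                    # hypothetical small machine
programs = ((N + 1) * (M + 2)) ** (N * M)      # 20**8 = 25,600,000,000 tables
bits = N * M * math.log2((N + 1) * (M + 2))    # about 34.6 bits
print(f"{programs:,} programs, i.e. about {bits:.1f} bits")
```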

He also proposes a redesign of the machine to remove any redundancies. The result of the paper shows the mathematical outcomes of changing the factors above.

Chaitin's second contribution to Kolmogorov complexity was in 1974, with a paper titled Information-Theoretic Limitations of Formal Systems [3]. In this paper Chaitin investigates mathematical applications of computational complexity. The goal of the paper was to examine the time and the number of bits of instructions needed to carry out a finite or infinite number of tasks. To accomplish this, he considered the size of the proofs and the tradeoff against how much is assumed. After analyzing Turing machines and mathematical models of these machines, Chaitin began to research randomness. In 1975 he wrote a paper titled Randomness and Mathematical Proof [4]. In it he established that for a non-random sequence such as 01 repeated ten times, the program a computer executes to produce the output can be quite small: an instruction like "print 01 ten times" yields the twenty-digit output 01010101010101010101, so the input processed by the computer is obviously smaller than the output. However, a random sequence of twenty outcomes is closely modeled by twenty coin flips, one of 2^20 possible binary series, and the program needed to produce such an output is at least the length of the sequence itself, essentially "print" followed by the twenty digits, so the input is the same size as the output. He also touched on other properties of randomness, such as the frequencies of binary digits, as well as formal systems, concluding that a formal system cannot prove a number to be random unless the complexity of the number is less than that of the system itself; Gödel's theorem was used to prove this. Lastly, in 1977 Chaitin published a paper titled Algorithmic Information Theory [5], in which he attempted to show that Solomonoff's probability measure on programs and his and Kolmogorov's ideas about the size of the smallest programs were equivalent. Others also examined randomness with mathematical proofs [4].
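
The claim that a typical run of twenty coin flips needs a program about as long as the sequence itself can be supported by a standard counting argument; the short sketch below is my own addition rather than something worked out in Chaitin's article.

```python
# Counting argument: there are at most 2**m - 1 binary programs shorter than m bits,
# so fewer than 2**(n-k) of the 2**n length-n strings can be compressed by k or more bits.
n, k = 20, 5
total_strings = 2 ** n                      # 1,048,576 possible 20-flip outcomes
short_programs = 2 ** (n - k) - 1           # programs shorter than n - k = 15 bits
print(f"{short_programs / total_strings:.2%} of 20-bit strings compressible by >= {k} bits")
# -> about 3.12%; the overwhelming majority need nearly 20 bits to describe.
```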

C. Ray Solomonoff

Solomonoff is known as one of the founders of Kolmogorov complexity. He used induction to predict future events based on prior results. In 1964 he published a two-part paper titled A Formal Theory of Inductive Inference [22] [23]. For instance, taking a Turing machine that outputs some sequence of symbols from an input sequence of some length, the output sequences can be assigned probabilities and used to predict future events. This is a form of extrapolating data, specifically the extrapolation of long sequences of symbols. Some of the mathematical models that he presented involved Bayes' theorem, which is based on conditional probability. One modern example of this is the Bayesian spam filter: to predict whether an incoming message is spam, previous information is used to assign a probability to the incoming message. If a user builds up an extensive list of known spam, a Bayesian filter can use it to classify a new incoming message based on its content. Solomonoff developed mathematics to prove his induction/extrapolation methods, but at the time of writing he also stated that the findings were based on heuristics and that it was likely the equations would need to be corrected. Another method that he proposed in his paper is what he called inverse Huffman coding. When using Huffman coding, knowing the probabilities of strings allows a code to be constructed that encodes the strings minimally, so that the strings with the highest probability get the shortest codewords. Inverting Huffman coding means that a minimal code is obtained for the string first, and from this code a probability for the string is derived. In part two of his paper he applied his models to Bernoulli sequences, some Markov chains, and phrase structure grammars.
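
Returning to the Bayesian spam-filter illustration above, the toy sketch below shows how prior messages can be used to assign a spam probability to a new one. The message lists, smoothing constant, and equal prior are hypothetical choices of mine, and this is ordinary naive Bayes rather than Solomonoff's universal induction.

```python
# Toy Bayesian spam scoring on hypothetical data: previously seen messages are
# used to assign a probability that a new incoming message is spam.
from collections import Counter

spam_msgs = ["win money now", "free money offer", "win a free prize"]
ham_msgs = ["meeting at noon", "project status update", "lunch at noon"]

def word_model(msgs):
    counts = Counter(w for m in msgs for w in m.split())
    total = sum(counts.values())
    # Laplace smoothing so unseen words do not force the product to zero
    return lambda w: (counts[w] + 1) / (total + 1000)

p_word_spam, p_word_ham = word_model(spam_msgs), word_model(ham_msgs)

def spam_probability(message, prior_spam=0.5):
    p_spam, p_ham = prior_spam, 1 - prior_spam
    for w in message.split():
        p_spam *= p_word_spam(w)
        p_ham *= p_word_ham(w)
    return p_spam / (p_spam + p_ham)   # Bayes' theorem: P(spam | words)

print(spam_probability("free money"))      # high (about 0.9)
print(spam_probability("status update"))   # low (about 0.2)
```
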
D. Others

Two authors who provide the basis for developing proofs have been mentioned throughout the previous publications. One is William of Ockham, to whom Occam's razor is attributed. This principle states that entities should not be multiplied beyond necessity [17]: if there are multiple theories, choose the simplest, which gives predictions higher confidence. The other is Kurt Gödel, to whom Gödel's incompleteness theorem is attributed. The theorem uses self-reference to show that there are true statements that cannot be proven within the system, and it is used in theory to determine what can be done and where the limits lie. The fact that Kolmogorov complexity cannot be computed is due to Gödel's theorem.

IV. FURTHER DEVELOPMENTS

After the ideas of Kolmogorov complexity were put forth, students of Kolmogorov began developing them further. Martin-Löf's work in 1966 [15] was dedicated to defining random sequences as well as developing tests for random sequences. This was presented in a paper titled The Definition of Random Sequences. In this paper Martin-Löf first restates the complexity measure of Kolmogorov and follows it with definitions and proofs of a universal test for randomness. He also develops the idea that the non-random sequences form a (constructive) null set. The remainder of the paper is dedicated to Bernoulli sequences. Zvonkin and Levin [14] also studied Kolmogorov's work and wrote a paper surveying the results of Kolmogorov, Solomonoff, Martin-Löf, and others; they developed further ideas about universal probability and its relationship to complexity. In the 1970s, Schnorr studied and published three papers [19] [20] [21] on random sequences. These papers state several definitions concerning finite and infinite random sequences; in one paper he states that he formulates a thesis on the true concept of randomness. These studies are related to gambling. Cover and Thomas [6] explain this connection in a more comprehensive manner and call it universal gambling. They explain that if a gambler bets on sequences consisting of 0s and 1s (x ∈ {0, 1}*) without knowing the origin of the sequence, there is a way to associate his wealth with a betting scheme.

The theorem that is proposed states that the length of the sequence, l(x), is no greater than the logarithm of the wealth a gambler achieves on the sequence plus the complexity of the sequence; formally, log S(x) + K(x) ≥ l(x). This is proven using Schnorr's papers.

V. ADVANCEMENTS IN TOPIC

Since the publication of Kolmogorov complexity in the 1960s there have been several developments and applications in this area. One of the topics is prefix-free complexity. Prefix-free languages solve a specific problem: determining where one string stops and another begins. Several papers have been written by Miller [16] and by Bienvenu and Downey [1] examining upper bounds on plain and prefix-free Kolmogorov complexity. New applications of Kolmogorov complexity have also been developed. For example, Boris Ryabko and Daniil Ryabko published a paper titled Using Kolmogorov Complexity for Understanding Some Limitations on Steganography [18]. Steganography is a way of writing hidden messages so that the sender and the recipient can convey messages without others suspecting that any message exists. They describe cover texts, which can be a sequence of photographic images, videos, or texts exchanged between two parties over a public channel. The main idea is that the speed of transmission of secret information is proportional to the length of the cover text. Therefore, cover texts of maximal Kolmogorov complexity would require a stegosystem of the same order of complexity, and such a system does not exist. They also define secure stegosystems that are simple and have a high speed of transmitting secret information, where simplicity is measured by a complexity of exp(o(n)) as n becomes very large and approaches infinity. Highly complicated sources of data are thus modeled by Kolmogorov complexity.

Kolmogorov complexity has also been used to study data mining, because data mining is equivalent to data compression. Data mining is the process of examining and extracting patterns from large sets of data. Faloutsos and Megalooikonomou have written a paper, On Data Mining, Compression, and Kolmogorov Complexity [7], in which they argue that data mining is an art and not a theory because the Kolmogorov complexity cannot be found; that is, data mining cannot be fully automated, and many different approaches to it exist.

Another very interesting and recent application of Kolmogorov complexity is detecting denial-of-service attacks. Kulkarni, Bush, and Evans wrote a paper on this topic titled Detecting Distributed Denial-of-Service Attacks Using Kolmogorov Complexity Metrics [13]. A denial-of-service attack occurs when a source machine floods a target machine, typically with ICMP or UDP protocol packets, overloading the target. In this case the details (bits) of the packets are of specific interest and can be analyzed based on the fundamentals of information theory. In the paper, the complexity of two random strings X and Y satisfies

K(XY) ≤ K(X) + K(Y) + c,

which comes from a fundamental theorem of Kolmogorov complexity. Here K(XY) is the joint complexity of the strings X and Y, and K(X) and K(Y) are their individual complexities. When the correlation between the strings increases, the joint complexity decreases. When a high volume of traffic occurs, the packets can be analyzed in terms of their destination, type, and execution pattern. The authors show that by sampling packets, a complexity differential can be computed over the collected samples and used to determine whether a denial-of-service attack is occurring. Their results showed that this type of method outperforms current methods of applying filters.
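
A rough sense of how such a complexity differential can be estimated in practice is given below, using zlib's compressed length as a computable stand-in for Kolmogorov complexity; the synthetic "packet" samples are my own illustration, not the metric defined in the paper.

```python
# Approximating K(X), K(Y) and the joint K(XY) by compressed length (zlib):
# highly correlated samples (as in a flood of near-identical packets) have a joint
# complexity far below K(X) + K(Y), while unrelated samples do not.
import os
import zlib

def C(data: bytes) -> int:
    """Compressed length in bytes, a crude upper bound on Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

flood_x = b"SYN dst=10.0.0.1 port=80 " * 40               # repetitive flood-like sample
flood_y = b"SYN dst=10.0.0.1 port=80 " * 40
normal_x, normal_y = os.urandom(1000), os.urandom(1000)   # unrelated traffic samples

print("flood:  C(X)+C(Y) =", C(flood_x) + C(flood_y), " C(XY) =", C(flood_x + flood_y))
print("normal: C(X)+C(Y) =", C(normal_x) + C(normal_y), " C(XY) =", C(normal_x + normal_y))
# For the flood pair C(XY) is far below C(X) + C(Y); for the unrelated pair the two
# values nearly coincide, consistent with K(XY) <= K(X) + K(Y) + c.
```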

VI. CONCLUSION

In conclusion, Kolmogorov complexity studies the complexity of objects and defines the complexity of an object as the length of the shortest binary program that describes it. It is the subset of information theory that relates to computer science. The idea was developed by Solomonoff, Kolmogorov, and Chaitin, who approached it through inductive inference, the use of Turing machines, the study of random and non-random objects (strings), and other means. Although Kolmogorov complexity cannot be computed, it can be used as a way of thinking and as a basis for inductive inference in understanding computer science. It connects to fundamental information-theoretic concepts such as entropy, the expected length of the shortest binary computer description. The topic has many applications in physics, computer science, and communication theory [17], and it is still an active area of research.

REFERENCES

[1] L. Bienvenu and R. Downey, "Kolmogorov complexity and Solovay functions," CoRR, vol. abs/.
[2] G. J. Chaitin, "On the length of programs for computing binary sequences," Journal of the Association for Computing Machinery (ACM), vol. 13, 1966.
[3] G. J. Chaitin, "Information theoretical limitations of formal systems," Journal of the Association for Computing Machinery (ACM), vol. 21, 1974.
[4] G. J. Chaitin, "Randomness and mathematical proof," Scientific American, vol. 232, no. 5, 1975.
[5] G. J. Chaitin, "Algorithmic information theory," IBM Journal of Research and Development, vol. 21, 1977.
[6] T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed. New York: John Wiley & Sons, Inc.
[7] C. Faloutsos and V. Megalooikonomou, "On data mining, compression, and Kolmogorov complexity," Data Mining and Knowledge Discovery, Tenth Anniversary Issue.
[8] A. N. Kolmogorov, "On the Shannon theory of information transmission in the case of continuous signals," IRE Transactions on Information Theory, vol. 2.
[9] A. N. Kolmogorov, "A new metric invariant of transitive dynamical systems and automorphisms in Lebesgue spaces," Doklady Akademii Nauk SSSR.
[10] A. N. Kolmogorov, "A new invariant for transitive dynamical systems," Doklady Akademii Nauk SSSR, vol. 119.
[11] A. N. Kolmogorov, "Three approaches to the quantitative definition of information," Problems of Information Transmission, vol. 1, pp. 4-7, 1965.
[12] A. N. Kolmogorov, "Logical basis for information theory and probability theory," IEEE Transactions on Information Theory, vol. 14, no. 5, 1968.
[13] A. Kulkarni and S. Bush, "Detecting distributed denial-of-service attacks using Kolmogorov complexity metrics," J. Netw. Syst. Manage., vol. 14, no. 1.
[14] L. A. Levin and A. K. Zvonkin, "The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms," Russ. Math. Surv., vol. 25, no. 6.
[15] P. Martin-Löf, "The definition of random sequences," Inform. and Control, vol. 9, 1966.
[16] J. S. Miller, "Contrasting plain and prefix-free Kolmogorov complexity," in preparation.
[17] M. Li and P. Vitányi, An Introduction to Kolmogorov Complexity and Its Applications, 2nd ed. New York: Springer.
[18] B. Ryabko and D. Ryabko, "Using Kolmogorov complexity for understanding some limitations on steganography," CoRR, vol. abs/.
[19] C. P. Schnorr, "A unified approach to the definition of random sequences," Math. Syst. Theory, vol. 5.
[20] C. P. Schnorr, "Process complexity and effective random tests," J. Comput. Syst. Sci., vol. 7.
[21] C. P. Schnorr, "A survey of the theory of random sequences," in R. Butts and J. Hintikka (Eds.), Logic, Methodology and Philosophy of Science. Reidel, Dordrecht, The Netherlands.
[22] R. J. Solomonoff, "A formal theory of inductive inference. Part I," Information and Control, vol. 7, no. 1, pp. 1-22, 1964.
[23] R. J. Solomonoff, "A formal theory of inductive inference. Part II," Information and Control, vol. 7, no. 2, 1964.
