Towards a Theory of Information Flow in the Finitary Process Soup

1 Towards a Theory of Information Flow in the Finitary Process Soup. Department of Computer Science and Complexity Sciences Center, University of California at Davis. June 1, 2010

2 Goals. Analyze a model of evolutionary self-organization in terms of information flow. Measure the channel capacity of the elementary ǫ-transducers. Develop a functional partitioning based on channel capacity and language properties.

3 Outline

4 ǫ-machines. An ǫ-machine is defined as T = {S, T}, where S is a set of causal states and T is the set of transitions between them, $T^{(s)}_{ij}$ with $s \in \mathcal{A}$. Properties: all of an ǫ-machine's recurrent states form a single strongly connected component; transitions are deterministic; and S is minimal, so an ǫ-machine is the smallest causal representation of the transformation it implements.

5 Transducers. We interpret each symbol labeling a transition, drawn from the alphabet A, as consisting of two parts: an input symbol that determines which transition to take from a state, and an output symbol that is emitted on taking that transition. Transducers therefore implement functions at three levels: character to character, input string to output string, and set to set (language to language).
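To make the two-part transition labels concrete, here is a minimal Python sketch (not from the talk) of a deterministic transducer; the class name, field layout, and the identity-machine example are illustrative assumptions.

```python
# A deterministic finite-state transducer: each transition consumes one input
# symbol and emits one output symbol.
class Transducer:
    def __init__(self, states, delta, start):
        self.states = states  # set of causal states
        self.delta = delta    # {(state, in_sym): (next_state, out_sym)}
        self.start = start    # initial state

    def transduce(self, word):
        """Map an input string to its output string, symbol by symbol."""
        state, out = self.start, []
        for sym in word:
            state, out_sym = self.delta[(state, sym)]
            out.append(out_sym)
        return "".join(out)

# Example: the one-state identity machine over the binary alphabet.
identity = Transducer(
    states={0},
    delta={(0, "0"): (0, "0"), (0, "1"): (0, "1")},
    start=0,
)
assert identity.transduce("0110") == "0110"
```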

6 Set of Machines

7 Transducer Composition. Composition creates a new mapping: the output language of one machine becomes the input language of the next. Composition is not commutative, and repeated composition can grow the number of states exponentially, which is why composed machines are minimized. A sketch of the construction follows.
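Reusing the Transducer sketch above, composition can be written as the standard product construction; minimization of the result is assumed to be handled separately.

```python
def compose(tb, ta):
    """Sketch of T_C = T_B o T_A: run T_A on each input symbol and feed its
    output symbol into T_B. The state set is a subset of S_A x S_B, so state
    counts can multiply under repeated composition, hence the minimization."""
    delta = {}
    for (sa, sym), (na, mid) in ta.delta.items():
        for sb in tb.states:
            nb, out = tb.delta[(sb, mid)]
            delta[((sa, sb), sym)] = ((na, nb), out)
    states = {(sa, sb) for sa in ta.states for sb in tb.states}
    return Transducer(states, delta, (ta.start, tb.start))
```

Composing in the other order generally yields a different machine, which is the non-commutativity noted above.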

8 Interaction Network. The interaction matrices $G^{(k)}$ record which compositions produce which machines:

$$G^{(k)}_{ij} = \begin{cases} 1 & \text{if } T_k = T_j \circ T_i \\ 0 & \text{otherwise.} \end{cases}$$
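Given a finite list of machines, the `compose` sketch above, and an equivalence test on minimized transducers (assumed here as `same_machine`), the interaction matrices can be filled in directly; this is a sketch, not the talk's code.

```python
import numpy as np

def interaction_matrix(machines, k, same_machine):
    """G^(k)_ij = 1 if T_k = T_j o T_i, else 0. `same_machine` is an assumed
    equivalence test on minimized transducers."""
    n = len(machines)
    G = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(n):
            if same_machine(compose(machines[j], machines[i]), machines[k]):
                G[i, j] = 1
    return G
```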

9 Transition Matrix. [Table omitted: the 16 × 16 composition table listing T_j ∘ T_i for the elementary transducers T_0 through T_15.]

10 Interaction Network. The interaction graph G has nodes corresponding to ǫ-transducers. If T_C = T_B ∘ T_A, place a directed edge from node T_A to node T_C, labeled with the transforming machine T_B.

11 [Figure omitted: the interaction network G over the transducers T_1, ..., T_15, with each directed edge labeled by its transforming machine.]

12 Population. The population P consists of N individual machines.

13 Population. A single replication is carried out through composition and replacement in a two-step sequence: (1) construct the ǫ-machine T_C by forming the composition T_C = T_B ∘ T_A, with T_A and T_B randomly selected from the population, and minimizing the result; (2) replace a randomly selected ǫ-machine T_D with T_C. Note that there is no imposed notion of fitness and no spatial component. The sketch below implements this update loop.
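A minimal sketch of the two-step replication dynamics, assuming `compose` and `minimize` are supplied (e.g., the composition sketch above plus a standard ǫ-machine minimization):

```python
import random

def soup_step(population, compose, minimize):
    """One replication: T_C = minimize(T_B o T_A) for randomly chosen T_A, T_B,
    then T_C overwrites a randomly chosen T_D. No fitness, no space."""
    ta = random.choice(population)
    tb = random.choice(population)
    tc = minimize(compose(tb, ta))
    d = random.randrange(len(population))  # T_D chosen uniformly at random
    population[d] = tc

def run_soup(population, compose, minimize, steps):
    for _ in range(steps):
        soup_step(population, compose, minimize)
    return population
```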

14 Relaxation to Steady State (Soup of Size 100,000). [Figure omitted: machine-type probability p versus scaled time t/N; the machines relax in groups with common trajectories, among them {T_3, T_5, T_10, ...}, {T_1, T_2, T_4, ...}, {T_7, T_11, T_13, T_14}, and {T_6, ...}.]

15 Communication Channels. A discrete channel [Cover and Thomas, 2006] is a triple (X, p(y|x), Y): X and Y are finite sets, and the p(y|x) are probability mass functions.

16 Channel Capacity. Cover and Thomas [Cover and Thomas, 2006] define the channel capacity of a discrete memoryless channel as

$$C = \max_{p(x)} I(X;Y) \qquad (1)$$

This capacity specifies the highest rate, in bits, at which information may be reliably transmitted through the channel, and Shannon's second theorem states that this rate is achievable in practice.
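The maximization in (1) can be carried out numerically with the Blahut-Arimoto algorithm; the following sketch (an assumption of this write-up, not part of the talk) computes the capacity of a channel given as a row-stochastic matrix.

```python
import numpy as np

def channel_capacity(P, iters=1000, tol=1e-12):
    """Blahut-Arimoto estimate of C = max_{p(x)} I(X;Y) in bits, where
    P[x, y] = p(y|x) and each row of P sums to 1."""
    nx = P.shape[0]
    p = np.full(nx, 1.0 / nx)  # start from the uniform input distribution

    def divergences(p):
        q = p @ P  # output distribution p(y)
        with np.errstate(divide="ignore", invalid="ignore"):
            logratio = np.where(P > 0, np.log2(P / q), 0.0)
        return (P * logratio).sum(axis=1)  # D[x] = KL(p(y|x) || p(y))

    for _ in range(iters):
        D = divergences(p)
        new_p = p * np.exp2(D)
        new_p /= new_p.sum()
        if np.max(np.abs(new_p - p)) < tol:
            p = new_p
            break
        p = new_p
    return float(p @ divergences(p))

# Binary symmetric channel with crossover 0.1: C = 1 - H(0.1) ~ 0.531 bits.
bsc = np.array([[0.9, 0.1], [0.1, 0.9]])
print(channel_capacity(bsc))
```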

17 Mutual Information. Remember that mutual information is the reduction in uncertainty about one random variable due to knowledge of another:

$$I(X;Y) = H(Y) - H(Y \mid X) \qquad (2)$$
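Equation (2) translates directly into a few lines of code; a sketch under the same conventions as above (P is the channel matrix, p_x the input distribution):

```python
import numpy as np

def mutual_information(p_x, P):
    """I(X;Y) = H(Y) - H(Y|X) in bits, for input distribution p_x and
    channel matrix P[x, y] = p(y|x)."""
    def H(dist):
        d = dist[dist > 0]
        return float(-(d * np.log2(d)).sum())
    p_y = p_x @ P
    H_Y_given_X = sum(p_x[x] * H(P[x]) for x in range(len(p_x)))
    return H(p_y) - float(H_Y_given_X)

# Uniform input through the binary symmetric channel above: ~0.531 bits.
print(mutual_information(np.array([0.5, 0.5]),
                         np.array([[0.9, 0.1], [0.1, 0.9]])))
```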

18 Discrete Noiseless Channels. In a discrete noiseless channel there is a one-to-one correspondence between input and output symbols.

19 Languages. Machines can be partitioned by their language properties: those whose input language satisfies L_in = Σ* and those whose output language satisfies L_out = Σ*; the union of these classes is where positive channel capacity is possible.

20 Capacities. [Figure omitted: channel capacity in bits, on a scale from 0 to 1, as a function of the input distribution P(X).]

21 Machines of Maximal Capacity. [Figure omitted: channel capacity in bits versus P(X) for the maximal-capacity machines.]

22 Partially Noisy Channels. [Figure omitted: channel capacity in bits versus P(X).]

23 Partially Noisy Channels. [Figure omitted: channel capacity in bits versus P(X).]

24 Capacities. Most channel capacities are zero, except for a handful of machines. [Figure omitted: capacity versus machine number and P(X = 1).]

25 Meta-machines. A meta-machine is a set of machines that is closed and self-maintaining under composition. Ω ⊆ P is a meta-machine if and only if (i) T_i ∘ T_j ∈ Ω for all T_i, T_j ∈ Ω, and (ii) for every T_k ∈ Ω there exist T_i, T_j ∈ Ω such that T_k = T_i ∘ T_j. This captures the notion of an invariant set: Ω = G ∘ Ω. A sketch of this two-part test appears below.
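Both conditions reduce to membership tests over the finite set of pairwise compositions; a sketch, again assuming `compose` and a `same_machine` equivalence test on minimized machines:

```python
def is_meta_machine(omega, compose, same_machine):
    """Check (i) closure: every pairwise composition lies in omega, and
    (ii) self-maintenance: every member arises as some pairwise composition."""
    products = [compose(ti, tj) for ti in omega for tj in omega]
    closed = all(any(same_machine(p, t) for t in omega) for p in products)
    maintained = all(any(same_machine(p, t) for p in products) for t in omega)
    return closed and maintained
```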

26 [Figure omitted: the interaction network G over the transducers T_1, ..., T_15, shown again in the context of meta-machines.]

27 [Figure omitted: example meta-machines over the machines T_A, T_B, T_C, T_D, T_E, T_H, T_J, T_L, T_O, annotated with steady-state frequencies such as T_O (0.225), T_A, T_C, T_E, T_J (0.125 each), and T_B, T_D, T_H (0.069 each).] [Crutchfield, 2006]

28 Summary. Channel capacity is a function of machine structure. High channel capacity does not necessarily lead to persistence. All of the transducers in the single-state meta-machine have zero channel capacity. Composition never increases channel capacity.

29 For Further Reading. T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley-Interscience, 2006. J. P. Crutchfield and O. Görnerup, "Objects That Make Objects: The Population Dynamics of Structural Complexity," Journal of the Royal Society Interface, 3 (2006).
