Information in Biology


Lecture 3: Information in Biology Tsvi Tlusty, tsvi@unist.ac.kr

Living information is carried by molecular channels
Living systems: I. are self-replicating information processors, interacting with their environment; II. are made of molecules; III. evolve collectively.
Questions: What are the generic properties of molecular channels subject to evolution? Can an information-theory approach capture them? What other biological information channels are there?

Outline: Information in Biology
The concept of information is found in many living systems: DNA, signaling, neurons, ribosomes, evolution.
Goals: (1) Formalize and quantify biological information. (2) Apply it to various biological systems. (3) Look for common principles.
I. Information and statistical mechanics: Shannon's information theory and its relation to statistical mechanics.
II. Overview: living systems as information sources, channels and processors.
III. Molecular information and noise.
IV. Neural networks and coding theory.
V. Population dynamics, social interaction and sensing.

I. Basics of Information Theory (Shannon)

Shannon's information theory
Information theory is a branch of applied mathematics and electrical engineering, developed by Claude Elwood Shannon. Its main results are fundamental limits on signal processing: How well can data be compressed? How reliably can signals be communicated?
It has numerous applications beyond communication engineering: physics (statistical mechanics), mathematics (statistical inference), linguistics, computer science (cryptography, complexity), economics (portfolio theory).
The key quantity that measures information is entropy: it quantifies the uncertainty involved in predicting the value of a random variable (e.g., a coin flip or a die roll).
What are the biological implications?

Claude Elwood Shannon (1916-2001)
1937 master's thesis: A Symbolic Analysis of Relay and Switching Circuits.
1940 Ph.D. thesis: An Algebra for Mendelian Genetics.
WWII (Bell Labs): works on cryptography and fire-control systems: Data Smoothing and Prediction in Fire-Control Systems; Communication Theory of Secrecy Systems.
1948: A Mathematical Theory of Communication.
1949: Sampling theory: analog to digital.
1951: Prediction and Entropy of Printed English.
1950: Shannon's mouse, the first artificial learning machine.
1950: Programming a Computer for Playing Chess.
1960: First wearable computer, Las Vegas.

A Mathematical Theory of Communication
Shannon's seminal paper: "A Mathematical Theory of Communication", Bell System Technical Journal 27 (3): 379-423 (1948).
Basic scheme of communication:
- The information source produces messages.
- The transmitter converts the message to a signal.
- The channel conveys the signal, with noise.
- The receiver transforms the signal back into the message.
- The destination is the machine, person or organism receiving the message.
The paper introduces information entropy, measured in bits.

What is information?
"The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point."
Engineering perspective: how to make good transmission channels, a practical problem with telegraph lines. Define information as a measure of surprise: if a binary channel transmits only 1s, there is no information (no surprise); if the channel transmits 0s and 1s with equal probability, the information is maximal.

Intuition: the 20-questions game. Try to guess the object from: {barrel, cat, dog, ball, fish, box, building}. First strategy: wild guess.

Intuition: the 20-questions game. Optimal strategy: a balanced (equalized) tree of questions. Information = the number of yes/no questions in an optimal tree.
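
A rough comparison (the numbers are mine, not from the slide): with the seven equally likely objects above, wild guessing takes (1 + 2 + ... + 7)/7 = 4 guesses on average, while a balanced yes/no tree identifies the object in at most ceil(log₂ 7) = 3 questions, close to the entropy log₂ 7 ≈ 2.8 bits.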

Introducing the bit
If I have an (equal) choice between two alternatives, the information is I = 1 bit; in general, I = log₂(#alternatives).
Harry Nyquist (1924), Certain Factors Affecting Telegraph Speed.
Example: How many bits are in a genome of length N?
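
One way to answer the genome question, under the assumption (mine) that all four bases {A, C, G, T} are equally likely and independent: there are 4^N possible sequences, so I = log₂(4^N) = 2N bits, i.e. two bits per base.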

Information from Shannon's axioms
Shannon showed that the only function obeying certain natural postulates is
H = − Σ_i p_i log₂ p_i   (up to a proportionality constant).
Example: if X is uniformly distributed over 128 outcomes,
H = − Σ_i (1/128) log₂(1/128) = log₂ 128 = 7 bits.
Example: a very uneven coin shows heads only 1 time in 1024:
H = − 2^(−10) log₂ 2^(−10) − (1 − 2^(−10)) log₂(1 − 2^(−10)) ≈ (10 + log₂ e) · 2^(−10) bits.

Entropy of a binary channel
H = − p log₂ p − (1 − p) log₂(1 − p)
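
A minimal numerical sketch (my own, not part of the lecture) that reproduces the two worked examples above and the binary-channel entropy; the helper name entropy is an arbitrary choice.

# Shannon entropy in bits for a discrete distribution.
import numpy as np

def entropy(p):
    """Shannon entropy in bits; zero-probability terms contribute nothing."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Uniform distribution over 128 outcomes -> 7 bits
print(entropy(np.full(128, 1 / 128)))       # 7.0

# Very uneven coin: heads 1 time in 1024 -> about (10 + log2 e) * 2**-10 bits
p = 2.0 ** -10
print(entropy([p, 1 - p]))                  # ~0.0112 bits

# Binary channel: H(p) is maximal (1 bit) at p = 1/2
print(entropy([0.5, 0.5]))                  # 1.0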

Why call it entropy?
Shannon discussed this problem with John von Neumann: "My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.'"
M. Tribus and E. C. McIrvine, "Energy and information", Scientific American 224 (1971).

Shannon Entropy and Statistical Mechanics

I. Maxwell's Demon
Entropy in statistical mechanics is a measure of the uncertainty about a system after specifying its macroscopic observables, such as temperature and pressure. Given the macroscopic variables, entropy measures the degree to which probability is spread over the different possible states.
Boltzmann's famous formula: S = k_B ln Ω, where Ω is the number of microstates consistent with the macrostate.

The second law of thermodynamics
The second law: in general, the total entropy of a system isolated from its environment will tend not to decrease.
Consequences: (i) heat will not flow from a colder body to a hotter body without work being performed; (ii) no perpetuum mobile: one cannot produce net work from a single temperature reservoir (producing net work requires heat to flow from a hotter reservoir to a colder one).

Maxwell's thought experiment: how to violate the second law?
- A container is divided by an insulated wall.
- A door in the wall can be opened and closed by a demon.
- The demon opens the door to allow only "hot" molecules of gas to flow to the favored side.
- One side heats up while the other cools down: entropy decreases, breaking the 2nd law!

Solution: entropy = −information
The demon reduces the thermodynamic entropy of the system using information about individual molecules (their direction). Landauer (1961) showed that the demon must increase the thermodynamic entropy by at least the amount of Shannon information he acquires and stores, so the total thermodynamic entropy does not decrease. The minimal cost of erasing one bit is
E_min = k_B T ln 2.
Landauer's principle: any logically irreversible manipulation of information, such as the erasure of a bit, must be accompanied by a corresponding entropy increase in the information-processing apparatus or its environment.
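
As a rough order-of-magnitude check (not on the slide): at room temperature, T ≈ 300 K, erasing a single bit costs at least E_min = k_B T ln 2 ≈ (1.38 × 10^−23 J/K)(300 K)(0.693) ≈ 2.9 × 10^−21 J.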

Maximum entropy inference (E. T. Jaynes)
Problem: given partial knowledge, e.g. the average value <X> of X, how should we assign probabilities P(X) to the outcomes?
Answer: choose the probability distribution that maximizes the entropy (surprise) while remaining consistent with what we already know.
Example: given energies E_i and a measurement <E>, what is p_i?
Exercise: which die distribution over X = {1,2,3,4,5,6} gives <X> = 3? (A numerical sketch follows.)
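
A possible numerical solution to the exercise (the exponential form and the helper names are my assumptions, following the standard maximum-entropy result): the entropy-maximizing distribution on {1,...,6} with a fixed mean has the Boltzmann-like form p_i ∝ exp(−λ i), and we solve for λ so that <X> = 3.

# Maximum-entropy die with constrained mean, via a one-dimensional root find.
import numpy as np
from scipy.optimize import brentq

x = np.arange(1, 7)

def mean_given_lam(lam):
    w = np.exp(-lam * x)
    p = w / w.sum()
    return np.sum(x * p)

# lam = 0 gives the uniform die with <X> = 3.5; <X> = 3 requires lam > 0
lam = brentq(lambda lam: mean_given_lam(lam) - 3.0, -5.0, 5.0)
p = np.exp(-lam * x)
p /= p.sum()
print(lam, p, np.sum(x * p))   # the max-entropy die with mean exactly 3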

The maximum entropy principle relates thermodynamics and information
At equilibrium, the thermodynamic entropy equals the Shannon information needed to define the microscopic state of the system, given its macroscopic description. A gain in entropy always means a loss of information about this state. Equivalently, the thermodynamic entropy is the minimum number of yes/no questions that must be answered in order to fully specify the microstate.

Additional slides

Conditional probability
Consider two random variables X, Y with a joint probability distribution P(X,Y).
- The joint entropy is H(X,Y).
- The mutual information is I(X;Y) = H(X) + H(Y) − H(X,Y).
- The conditional entropies are H(X|Y) and H(Y|X).

- Joint entropy H(X,Y) measures the total entropy of the joint distribution P(X,Y).
- Mutual information I(X;Y) measures correlations: P(x,y) = P(x)P(y) implies I = 0.
- Conditional entropy H(X|Y) measures the remaining uncertainty about X given Y.
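
A toy numerical illustration (the joint distribution is my own example, not from the slides) of how these quantities are computed and related.

# Joint, marginal, conditional entropies and mutual information for a discrete pair.
import numpy as np

def H(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distribution P(X,Y) as a table: rows = x values, columns = y values
Pxy = np.array([[0.30, 0.10],
                [0.10, 0.50]])
Px, Py = Pxy.sum(axis=1), Pxy.sum(axis=0)

Hxy = H(Pxy)              # joint entropy H(X,Y)
Hx, Hy = H(Px), H(Py)
I = Hx + Hy - Hxy         # mutual information I(X;Y) >= 0, zero iff independent
Hx_given_y = Hxy - Hy     # conditional entropy H(X|Y)
print(Hxy, Hx, Hy, I, Hx_given_y)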

More information measures

Whose entropy? E. T. Jaynes

Kullback-Leibler entropy
Problem: suppose a random variable X is distributed according to P(X), but we expected it to be distributed according to Q(X). What is the level of our surprise?
Answer: the Kullback-Leibler divergence
D_KL(p ‖ q) = Σ_i p_i log₂(p_i / q_i).
Mutual information is a special case:
I(X;Y) = D_KL( P(x,y) ‖ P(x)P(y) ).
It appears in many circumstances, for example Markov chains.
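
A quick numerical check (using the same example joint distribution as above, chosen by me) that the mutual information equals the Kullback-Leibler divergence between the joint distribution and the product of its marginals.

# D_KL in bits, and the identity I(X;Y) = D_KL( P(x,y) || P(x)P(y) ).
import numpy as np

def dkl(p, q):
    p, q = np.asarray(p, float).ravel(), np.asarray(q, float).ravel()
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

Pxy = np.array([[0.30, 0.10],
                [0.10, 0.50]])
Px, Py = Pxy.sum(axis=1), Pxy.sum(axis=0)
product = np.outer(Px, Py)          # distribution if X and Y were independent

print(dkl(Pxy, product))            # equals I(X;Y) computed in the previous sketch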

II. Second law in Markov chains
Random walk on a graph: W is the transition matrix, p(t+1) = W p(t). Let p* be the steady-state solution, W p* = p*.
Theorem: the distribution approaches the steady state, ∂_t D(p ‖ p*) ≤ 0, and ∂_t D(p ‖ p*) = 0 if and only if p = p*.
In other words: Markov dynamics dissipates any initial information.
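
A small simulation sketch (the transition matrix is an arbitrary choice of mine) illustrating the theorem: D(p(t) ‖ p*) decreases monotonically as the distribution evolves under p(t+1) = W p(t).

# Relaxation of a Markov chain toward its steady state, tracked by D(p || p*).
import numpy as np

def dkl(p, q):
    return np.sum(p * np.log2(p / q))

# Column-stochastic transition matrix W: W[i, j] = P(j -> i)
W = np.array([[0.90, 0.20, 0.10],
              [0.05, 0.70, 0.30],
              [0.05, 0.10, 0.60]])

# Steady state p*: eigenvector of W with eigenvalue 1, normalized to sum to 1
vals, vecs = np.linalg.eig(W)
p_star = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
p_star /= p_star.sum()

p = np.array([0.98, 0.01, 0.01])    # initial distribution far from p*
for t in range(10):
    print(t, dkl(p, p_star))        # this divergence shrinks at every step
    p = W @ p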