Draft of a lecture with exercises

Piotr Chołda
March 2017

1 Introduction to the Course

1. Teachers:

   Lecture and exam: Piotr Chołda, Ph.D., Dr. habil. Room: 3, pav. D5 (first floor). Office hours: please contact me by e-mail. Phone: (67 ). E-mail: piotr.cholda@agh.edu.pl. WWW: course webpage.

   Exercise classes and projects: Andrzej Kamisiński, M.Sc. E-mail: andrzejk@agh.edu.pl.

2. How to get a credit (a positive final grade for the course):

   Exercise classes: with the support of Matlab; three tests K1-K3 (3 × 20 = 60 pts.); short quizzes (10 × 4 = 40 pts.); up to two correction (revision) tests are allowed, provided some thresholds are met; activity during classes is scored with single points (no negative points); additional points can be obtained by solving exercises or performing tasks denoted with asterisk(s); the exercise classes grade is calculated according to the AGH Bylaw; the precise rules related to the exercise classes are given at the course webpage in the document "Rules related to the exercise classes", which is an integral part of the course rules.

   Project: implementation of a code or a procedure related to information theory. The precise rules related to the project are given at the course webpage in the document "Information on the project and the proposed topics", which is an integral part of the course rules.

   Oral exam (contents of the lecture): the exam can be taken only by the students who have obtained a credit for the exercise classes and the project. The final grade is calculated as the weighted arithmetic mean of the exercise classes grade (30%), the project grade (30%) and the exam grade (40%), while the exam grade takes into account all failed terms. The grade is determined according to the AGH Bylaw.

   Simplified version of the exam:
   - presentation of a lecture during the semester,
   - exam: limited to the topic of the presented lecture,
   - detailed conditions: topics and dates are given at Google Docs; besides the topic, the main problems to be covered and the suggested literature are also given,
   - required format: LaTeX, beamer class; the template is given at the course webpage,
   - deadlines: it is necessary to provide the teacher with the slides two weeks prior to the scheduled presentation; the suggestions given by the teacher with respect to the initial version of the presentation should be taken into account,
   - presence during the lectures: at least 50% (i.e., 7 lectures).

3. Presence/absence: If at least two students attend the lectures, the lectures are not obligatory. Starting from the lecture where the attendance is smaller than two students, the lectures become obligatory and presence is taken into account (no more than 40% of absences is allowed). Due to the AGH Bylaw, presence at exercise classes is obligatory.

4. There are no presentation slides available; only lecture drafts are given before a lecture (at the WWW page; sometimes modifications can be introduced after a lecture).

2 Bibliography

In case you have problems with finding the books, you can borrow them from the teacher.

Dominic Welsh. Codes and Cryptography. Clarendon Press, Oxford, UK, 1988.

Thomas M. Cover and Joy A. Thomas. Elements of Information Theory. John Wiley & Sons, Inc., New York, NY, 1991.

Gareth A. Jones and J. Mary Jones. Information and Coding Theory. Springer-Verlag London Ltd., London, UK, 2000.

Todd K. Moon. Error Correction Coding. John Wiley & Sons, Inc., Hoboken, NJ, 2005.

Stefan M. Moser and Po-Ning Chen. A Student's Guide to Coding and Information Theory. Cambridge University Press, Cambridge, UK, 2012.

3 Lecture I (March 7, 2017): entropy, channels, capacity - a primer to information theory

3.1 Sources of information and entropy

1. Digital communication system.

2. Noise: open, closed; external (thunderstorms, radiation), internal (thermal/Johnson noise, shot noise).

3. Message sources:
   - set of messages: discrete, continuous,
   - distributions: deterministic, random (probabilistic, stochastic).

4. Information capacity/density (of a message): if message x_i is generated with probability Pr{x_i} = p, its information capacity (zawartość informacyjna) is expressed as:

      I(x_i) = I(p) = log_r(1/p) = -log_r p.

   The typical base for our course is r = 2.

5. Entropy of a random variable (source) X with a discrete probability distribution Pr{X = x_i} = Pr{x_i} = p_i (with Σ_i p_i = 1) can be characterized with the following entropy (entropia zmiennej losowej):

      H(X) = H(p_1, p_2, ...) = -Σ_i p_i lg p_i   [bit(s)].

   Entropy is the average value of the information capacity for the random variable (information source), E[I]. The more random the data, the larger the entropy.

6. For a discrete probability distribution of random variable X, with the cardinality of the realization set equal to N (we can identify this distribution with the probability distribution of message generation for a particular information source):

   Limit for non-zero probabilities: lim_{p→0} p lg p = 0.

   Upper and lower bounds: 0 ≤ H(X) ≤ lg N.

   Equiprobable distribution: for the probability distribution Pr{X = x_i} = 1/N, 1 ≤ i ≤ N, entropy reaches its maximum over all N-element distributions:

      H_max(X) = H(1/N, 1/N, ..., 1/N) = lg N.
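A minimal Python sketch illustrating items 4-6; the function name and the example distributions below are arbitrary choices, not prescribed by the course:

```python
import math

def entropy(probs, r=2):
    """H(X) = -sum_i p_i log_r p_i; terms with p_i = 0 contribute 0 (lim p lg p = 0)."""
    return -sum(p * math.log(p, r) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # a fair coin: 1 bit, the maximum lg 2 for N = 2
print(entropy([0.9, 0.1]))   # a biased coin: ~0.469 bit, i.e., less uncertainty on average
p = [0.7, 0.1, 0.1, 0.1]
assert 0 <= entropy(p) <= math.log2(len(p))   # bounds 0 <= H(X) <= lg N
```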

7. L-th extension of a source: consider a discrete source X generating N possible messages x_1, x_2, ..., x_N. Now, let us consider a sequence of messages of source X of length L: s_i = x_i(1) x_i(2) ... x_i(L). A discrete source generating the sequences s_i is called the L-th extension of the source (L-krotne rozszerzenie źródła). If X^L is the L-th extension of a memoryless source X, then:

      H(X^L) = L · H(X).

8. Joint/mutual entropy: for two random variables X, Y with the joint probability distribution Pr{X = x_i, Y = y_j} = Pr{x_i, y_j} = p(i, j) = p_ij (with Σ_i Σ_j p(i, j) = 1), the joint entropy (entropia łączna) is defined as follows:

      H(X, Y) = -Σ_i Σ_j p(i, j) lg p(i, j).

9. A Markov source (źródło marko[wo]wskie) of order L is a source generating N messages x_1, ..., x_N that is described by probabilities Pr{s_i} and conditional probabilities Pr{x_j | s_i}, where s_i is the message sequence generated prior to x_j: s_i = x_i(1) ... x_i(L). The entropy of such a Markov source is defined as:

      H_L(X) = -Σ_{j=1..N} Σ_{i=1..N^L} Pr{x_j, s_i} lg Pr{x_j | s_i}
             = -Σ_{j=1..N} Σ_{i_1=1..N} ... Σ_{i_L=1..N} Pr{x_j, x_{i_1}, ..., x_{i_L}} lg Pr{x_j | x_{i_1}, ..., x_{i_L}}.

   We must not confuse source extensions (memoryless) and Markov sources (which always have memory)!

10. Conditional entropy: for jointly distributed random variables X and Y, the conditional entropy H(Y | X) (entropia warunkowa) is defined as the average value of the entropy of random variable Y given the occurrence of X:

      H(Y | X) = -Σ_i Σ_j Pr{x_i} Pr{y_j | x_i} lg Pr{y_j | x_i}.

11. Mutual information (transinformation) (informacja wzajemna, transinformacja) is defined as follows:

      I(X; Y) = Σ_{i=1..M} Σ_{j=1..N} Pr{x_i, y_j} lg [ Pr{x_i, y_j} / (Pr{x_i} Pr{y_j}) ].

   Mutual information quantifies the reduction in uncertainty about one random variable due to another random variable. We use a semicolon, I(X; Y), not a comma (the sometimes-used notation I(X, Y) denotes the joint entropy!). There is no minus sign in the definition formula!
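Before moving on to channels, here is a short Python sketch that checks the relations between the quantities defined in items 8-11 on a small, arbitrarily chosen joint distribution (all variable names are mine):

```python
import math

def H(probs):
    """Entropy of a probability vector, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# An arbitrary joint distribution p(x_i, y_j) for binary X and Y.
p_xy = [[0.3, 0.2],
        [0.1, 0.4]]
p_x = [sum(row) for row in p_xy]            # marginal distribution of X
p_y = [sum(col) for col in zip(*p_xy)]      # marginal distribution of Y

H_XY = H([p for row in p_xy for p in row])  # joint entropy H(X,Y)
H_Y_given_X = -sum(p_xy[i][j] * math.log2(p_xy[i][j] / p_x[i])
                   for i in range(2) for j in range(2) if p_xy[i][j] > 0)
I_XY = sum(p_xy[i][j] * math.log2(p_xy[i][j] / (p_x[i] * p_y[j]))
           for i in range(2) for j in range(2) if p_xy[i][j] > 0)

# Chain rule and the relation between transinformation and conditional entropy:
assert abs(H_XY - (H(p_x) + H_Y_given_X)) < 1e-12
assert abs(I_XY - (H(p_y) - H_Y_given_X)) < 1e-12
print(H_XY, H_Y_given_X, I_XY)
```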

3.2 Channels

1. Channels can be: discrete vs. continuous, binary, memoryless, stationary, symmetric.

   Figure 1: Binary erasure channel (inputs x_1 = 0, x_2 = 1; outputs y_1 = 0, y_2 = 1 and the erasure symbol y_3 = E).

2. A discrete memoryless channel is defined with: a set of input messages X = {x_1, x_2, ..., x_M}; a set of output messages Y = {y_1, y_2, ..., y_N}; and a set of M · N transition probabilities Pr{y_j | x_i} (represented by an N × M matrix P).

3. Noiseless channel (a purely theoretical construction): Pr{y_k | x_k = y_k} = 1.

4. Binary erasure channel (Figure 1).

5. Binary symmetric channel (BSC, binarny kanał symetryczny): Pr{0 | 1}_BSC = Pr{1 | 0}_BSC = BER (Bit Error Rate). Due to simplified modeling, we eagerly use memoryless channels, but this modeling is not exact for large bitrates (where burst errors take place).

6. Information channel is a very broad notion: a transmission channel (radio, fibre, etc.), a storage device, a packed file, etc.

3.3 Capacity

1. We are interested in the amount of information that can be transferred via a transmission channel (defined by X, Y, Pr{y_j | x_i}). Channel capacity (przepustowość kanału) for a discrete memoryless channel is defined as the maximum transinformation I(X; Y) that can be carried by the channel at once, where the maximization goes over all possible probability distributions of the input messages:

      C = max_{P_X} I(X; Y) = max_{P_X} [H(X) - H(X | Y)] = max_{P_X} [H(Y) - H(Y | X)].
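As a sketch of how this definition can be evaluated in practice, the following Python snippet brute-forces max over P_X of I(X; Y) for a BSC and compares the result with the closed form 1 - H(BER); the example BER value and the grid resolution are arbitrary:

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def I_bsc(px0, ber):
    """I(X;Y) = H(Y) - H(Y|X) for a BSC with crossover `ber` and Pr{x = 0} = px0."""
    py0 = px0 * (1 - ber) + (1 - px0) * ber   # output distribution
    return h2(py0) - h2(ber)                  # H(Y|X) = h2(ber) for either input value

ber = 0.1
C = max(I_bsc(k / 1000, ber) for k in range(1001))   # scan over input distributions
print(C, 1 - h2(ber))   # both ~0.531 bit; the maximum is reached at the uniform input
```

The same brute-force scan works for any small discrete memoryless channel, which is handy for checking the closed-form capacities appearing in the exercises below.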

3.4 Exercises

The best way to deal with the exercises (it is only a piece of advice, this is not homework):
- before the classes, look through all the exercises;
- try to solve at least one exercise from each group;
- during the classes, declare the exercises you recognize as very difficult to solve (we will try to solve them); typically, during the classes the students decide which exercises are to be solved;
- solve all the exercises after the classes (if there are some issues not covered during the classes, you can ask the teacher during his office hours).

1. Entropy calculation:

   (a) Which of the following sequences carries more information:
       i. 10 letters of the Latin alphabet (it is assumed that the generation of each of 32 letters is equally probable; we do not care about uppercase or lowercase letters),
       ii. 32 numerical digits (it is assumed that the generation of each of the ten digits is equally probable)?

   (b) What can be the maximum uncertainty related to a 6 × 9 cm² photograph, if the size of a single pixel equals 1 cm² and each pixel can be characterized with one of three values: white, gray and black?

   (c) A colleague tosses a coin three times and informs us how many heads (or tails) were obtained. What is the uncertainty about the result of the first toss?

2. Complex entropy calculations:

   (a) An information source generates messages according to the following scheme: in each time step, 0 or 1 is randomly chosen (with probabilities p_0 = 1 - p and p_1 = p, respectively). The drawing finishes when the first 1 is picked. The source sends the information in which step this has happened. Find the entropy of the source.

   (b) Source Z is defined as follows: (1) we have two sources X and Y with known entropies H(X) and H(Y), respectively; (2) the messages generated at every moment by these two sources are different (e.g., X generates letters, and Y generates digits); (3) at every moment, source Z selects with a known probability p to send the message just generated by X (equivalently, we can also say that it selects with probability 1 - p to send the message just generated by Y). Find the entropy of Z.

   (c) The inhabitants of a certain village are divided into two groups A and B. Half the people in group A always tell the truth, three tenths always lie, and two tenths always refuse to answer. In group B, three tenths of the people are truthful, half are liars, and two tenths always refuse to answer. Let p be the probability that a person selected at random belongs to group A. Let I(p) be the information conveyed about a person's truth-telling status by specifying his group membership. Find the maximum possible value of I and the percentage of people in group A for which the maximum occurs.
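For the first group of exercises, the basic arithmetic pattern is "number of symbols times lg(alphabet size)"; a tiny Python sanity check of that pattern for exercise 1(a), not a model solution:

```python
import math

# Exercise 1(a): i.i.d. equiprobable symbols carry (number of symbols) * lg(alphabet size) bits.
latin = 10 * math.log2(32)     # 10 letters from a 32-letter alphabet -> 50 bits
digits = 32 * math.log2(10)    # 32 digits from a 10-symbol alphabet  -> ~106.3 bits
print(latin, digits)
```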

3. Properties of various entropy measures:

   (a) Show that for independent random variables X, Y the following relationship holds for the joint (mutual) entropy: H(X, Y) = H(X) + H(Y).

   (b) Prove that the following property takes place (the so-called chain rule): H(X, Y) = H(X) + H(Y | X).

   (c) Show that for transinformation the following relations are true: I(X; Y) = I(Y; X) = H(X) - H(X | Y) = H(Y) - H(Y | X) = H(X) + H(Y) - H(X, Y).

   (d) Random variable X is equiprobably realized as an integer value from the set {1, 2, 3, ..., 2N}. On the other hand, random variable Y is defined as equal to 0 when the value of X is even, while Y = 1 when the value of X is odd. Prove that in such a case the following relationship is true for the conditional entropy: H(X | Y) = H(X) - 1. Then show that the following relation also holds: H(Y | X) = 0.

   (e) The messages of sources W and F are emitted in a mutually independent way. We know that the entropy of W is equal to eight bits and that of F to twelve bits. Plot the relationship H(W | F) = f(H(F | W)), i.e., H(W | F) as a function of the variable H(F | W) (from its minimum to its maximum value).

   (f) A discrete random variable X is given. Another random variable is defined by the following function: Y = g(X). Which of the following is the general relationship between H(Y) and H(X): H(Y) ≤ H(X) or H(Y) ≥ H(X)? What are the conditions on g(·) for the equality to hold?
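The identities in the group above can also be checked numerically; the sketch below tests the inequality of exercise 3(f) on randomly generated non-injective mappings and the claim of exercise 3(d) for one particular N (the seed, the ranges, and N = 8 are arbitrary choices):

```python
import math, random

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Exercise 3(f): for Y = g(X), H(Y) <= H(X); checked on random distributions and mappings.
random.seed(1)
for _ in range(1000):
    n = random.randint(2, 8)
    w = [random.random() for _ in range(n)]
    px = [v / sum(w) for v in w]
    g = [random.randrange(3) for _ in range(n)]   # a (generally non-injective) mapping g
    py = [sum(px[i] for i in range(n) if g[i] == k) for k in range(3)]
    assert H(py) <= H(px) + 1e-12

# Exercise 3(d): X uniform on {1, ..., 2N}, Y = parity of X; then H(X|Y) = H(X) - 1.
N = 8
H_X = math.log2(2 * N)
H_X_given_Y = H_X - H([0.5, 0.5])   # H(X|Y) = H(X,Y) - H(Y) = H(X) - H(Y), since H(Y|X) = 0
print(H_X_given_Y, math.log2(N))    # both equal lg N
```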

4. Sources with memory:

   (a) We are observing a source emitting two messages: $ and *. We note that the probabilities of generation of two-message blocks are given in Table 1. Check if this source is memoryless.

       Table 1: Frequencies of occurrence of the two-message blocks for exercise 4a

          block:        $$     $*     *$     **
          probability:  2/5    1/10   1/10   2/5

   (b) A source generates two messages: 0 and 1. Calculate the entropy understood in the classical way and the entropy of a Markov source for two cases:
       i. the messages are equiprobable, the source has memory, and: Pr{0 | 1} = Pr{1 | 0} = 1/3;
       ii. the source has memory, but for this case: Pr{0} = 3/4, Pr{0 | 0} = 1/2, Pr{0 | 1} = 1/3.

   (c) Draw a diagram of a Markov source of the 2nd order, if the source is binary and the conditional probabilities are given below:
       Pr{0 | 00} = Pr{1 | 11} = 0.8;
       Pr{1 | 00} = Pr{0 | 11} = 0.2;
       Pr{0 | 01} = Pr{0 | 10} = Pr{1 | 01} = Pr{1 | 10} = 0.5.
       Then, find the entropy of this source if we know that: Pr{00} = Pr{11} = 5/12, Pr{01} = Pr{10} = 1/12.

   (d) Figure 2 shows, by means of a state diagram, S, a Markov source of the first order. The lack of an arrow indicates that the related conditional probability equals zero. If a probability is not given next to an arrow, it means that this probability can be inferred somehow. For x_1, x_2, ..., x_N all transitions are analogous to those of x_i. Sketch plots of H(S) and H_1(S) as functions of p.

       Figure 2: A sample Markov source for exercise 4d (states x_0, x_1, ..., x_i, ..., x_N; transition probabilities expressed in terms of p and N).
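A short numerical companion to exercises 4(a) and 4(b)(i), using $ and * for the two symbols of Table 1 (the dictionary layout and helper names are arbitrary):

```python
import math

def h(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Exercise 4(a): block probabilities from Table 1.
p_block = {('$', '$'): 2/5, ('$', '*'): 1/10, ('*', '$'): 1/10, ('*', '*'): 2/5}
p_single = {'$': 2/5 + 1/10, '*': 1/10 + 2/5}          # each single-symbol marginal is 1/2
# A memoryless source would give Pr{$,$} = Pr{$} * Pr{$}; compare the two values.
print(p_block[('$', '$')], p_single['$'] ** 2)

# Exercise 4(b)(i): equiprobable messages with Pr{0|1} = Pr{1|0} = 1/3.
H_classical = h([0.5, 0.5])                             # ignores the memory: 1 bit
H_markov = 0.5 * h([1/3, 2/3]) + 0.5 * h([1/3, 2/3])    # first-order Markov entropy
print(H_classical, H_markov)                            # 1.0 vs ~0.918
```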

5. Channels:

   (a) A discrete memoryless binary channel, along with the probabilities of the input and output messages, is given in the figure below. Find the range of values of the joint probability of having simultaneously $ at the input and * at the output.

       Figure: binary channel for exercise 5a (symbols $ and *, probabilities 1/3 and 3/4, transition probabilities a and b).

   (b) Check if a cascade of binary symmetric channels (they are connected serially: an output of one is connected to an input of the next one; the channels do not necessarily have the same probabilistic characteristics) is also a binary symmetric channel.

   (c) We have a number of (not necessarily identical) binary symmetric channels characterized with probabilities of error given as 1/2 ≤ p_i < 1. How can we, using (not necessarily all of) them, construct a binary symmetric channel characterized with 0 < BER ≤ 1/2?

   (d) Information consisting of N binary symbols is transmitted through a binary symmetric channel with BER equal to p. What is the expected value of the number of bits transmitted without errors in this information?

   (e) A binary channel is represented with the following matrix:

          [ a       1 - a ]
          [ 1 - b     b   ]

       where at the input and the output we have symbols from the set {0, 1}, and a ≠ b. The receiver (taking signals from the channel) is given symbols 0 and 1 with the same frequency. Find the probability distribution at the input of this channel (at the output of the transmitter) and show that the entropy at the channel output is larger than the one at its input.

   (f) Find the mutual information I(X; Y) for a channel with two inputs x_1, x_2 ∈ X and three outputs y_1, y_2, y_3 ∈ Y, given in Figure 3, if Pr{x_1} = Pr{x_2} = 1/2.

       Figure 3: Example memoryless channel for exercise 5f (inputs x_1 = 0, x_2 = 1; outputs y_1 = 0, y_2 = 1, y_3 = E).
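For exercise 5(b) the key observation is that cascading channels multiplies their transition matrices; a small Python sketch (the two example crossover probabilities are arbitrary):

```python
def bsc(p):
    """Transition matrix of a BSC with crossover probability p (rows are inputs)."""
    return [[1 - p, p], [p, 1 - p]]

def cascade(A, B):
    """Channel A followed by channel B: the matrix product A * B."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

p1, p2 = 0.1, 0.2
print(cascade(bsc(p1), bsc(p2)))      # approximately [[0.74, 0.26], [0.26, 0.74]]: again a BSC
print(p1 * (1 - p2) + p2 * (1 - p1))  # its equivalent crossover probability: 0.26
```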

6. Capacity calculation:

   (a) A binary channel is characterized with the following transition probabilities: p(y_1 | x_1) = 1, p(y_1 | x_2) = p(y_2 | x_2) = 1/2. We observe that a one is present at the output of the channel thirteen times more often than a zero. Find the capacity of this channel.

   (b) A binary erasure channel (given in Figure 4) is characterized with the following transition matrix:

          [ 1 - α - β      α        β ]
          [     α       1 - α - β   β ]

       Show that the capacity of this channel is given by the following formula:

          C = (1 - β)[1 - lg(1 - β)] + (1 - α - β) lg(1 - α - β) + α lg α.

       Figure 4: Binary erasure channel for exercise 6b (inputs x_1 = 0, x_2 = 1; outputs y_1 = 0, y_2 = 1, y_3 = E).

   (c) Find the channel capacity for the binary erasure channel shown in Figure 5.

       Figure 5: Binary erasure channel for exercise 6c (with transition probabilities p and 1 - p).

   (d) Find the capacity of the channel given in Figure 6 with the transition probabilities p(y_j | x_i) = 1/2: p(y_1 | x_2) = p(y_1 | x_5) = p(y_2 | x_1) = p(y_2 | x_3) = p(y_3 | x_2) = p(y_3 | x_4) = p(y_4 | x_3) = p(y_4 | x_5) = p(y_5 | x_4) = p(y_5 | x_1) = 1/2 (the channel is symmetric).

       Figure 6: Exemplary symmetric channel for exercise 6d (inputs x_1, ..., x_5; outputs y_1, ..., y_5).
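The capacity formula of exercise 6(b) can be cross-checked numerically, assuming the transition matrix written above (error probability α, erasure probability β); the example values of α and β and the grid resolution are arbitrary:

```python
import math

def lg(x):
    return math.log2(x) if x > 0 else 0.0

def I(px0, P):
    """I(X;Y) for a two-input channel with transition matrix P and Pr{x_1} = px0."""
    px = [px0, 1 - px0]
    py = [sum(px[i] * P[i][j] for i in range(2)) for j in range(len(P[0]))]
    H_Y = -sum(q * lg(q) for q in py)
    H_Y_given_X = -sum(px[i] * P[i][j] * lg(P[i][j])
                       for i in range(2) for j in range(len(P[0])))
    return H_Y - H_Y_given_X

alpha, beta = 0.05, 0.2
P = [[1 - alpha - beta, alpha, beta],
     [alpha, 1 - alpha - beta, beta]]
C_numeric = max(I(k / 1000, P) for k in range(1001))
C_formula = ((1 - beta) * (1 - lg(1 - beta))
             + (1 - alpha - beta) * lg(1 - alpha - beta) + alpha * lg(alpha))
print(C_numeric, C_formula)   # both ~0.53 bit; the maximum is at the uniform input
```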

   (e) Let us observe an M-ary input, M-ary output, symmetric channel for which Pr{y_j | x_i} = P_s/(M - 1) (if i ≠ j) and 0 < P_s < 1. Find the capacity of this channel.

   (f) A simplified numerical keypad is given in Figure 7. It has four keys (in two rows, each row contains two keys). A user planning to push key x will push another key in the same row with probability α or another key in the same column with probability α. That means that the user will push the intended key x with probability 1 - 2α. Such a keypad can be described as a transmission channel with the input being the user's intention and the output being the actually pushed key. Find the matrix describing this channel and its capacity.

       Figure 7: Simplified keypad for exercise 6f (four keys arranged in two rows and two columns).

   (g) N identical binary symmetric channels, each with bit error rate 0 < p < 1, are connected in cascade. Show that the capacity of such a cascade is equal to:

          C_N = 1 + p_N lg p_N + (1 - p_N) lg(1 - p_N),   where p_N = (1/2)[1 + (1 - 2p)^N].

   (h) A weather forecast output for some town is given in Table 2. A man noted that the forecast is accurate only in 3/4 of the cases (show why). He also noted that he would be right in 13/16 of all cases if he always foresaw that it will be sunny. He suggested to a local radio station that it is better to always say in the weather forecast that it will be sunny. They rejected his proposal, using reasoning based on the information-theoretic concept of channel capacity. Reconstruct this reasoning.

       Table 2: Example weather forecast results for exercise 6h

                           Reality
          Forecast      Rains     Sunny
          Rains         1/8       3/16
          Sunny         1/16      10/16
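A numerical reading of Table 2 for exercise 6(h): the sketch below computes the forecast accuracy, the accuracy of the "always sunny" strategy, and the mutual information between forecast and reality (the dictionary keys are arbitrary labels):

```python
import math

def lg(x):
    return math.log2(x) if x > 0 else 0.0

# Joint probabilities Pr{forecast, reality} from Table 2 (rows: forecast, columns: reality).
p = {('rains', 'rains'): 1/8,  ('rains', 'sunny'): 3/16,
     ('sunny', 'rains'): 1/16, ('sunny', 'sunny'): 10/16}

accuracy = p[('rains', 'rains')] + p[('sunny', 'sunny')]       # 3/4
always_sunny = p[('rains', 'sunny')] + p[('sunny', 'sunny')]   # Pr{reality = sunny} = 13/16

p_f = {f: p[(f, 'rains')] + p[(f, 'sunny')] for f in ('rains', 'sunny')}
p_r = {r: p[('rains', r)] + p[('sunny', r)] for r in ('rains', 'sunny')}
I = sum(p[(f, r)] * lg(p[(f, r)] / (p_f[f] * p_r[r])) for (f, r) in p)

print(accuracy, always_sunny)   # 0.75 and 0.8125
print(I)                        # ~0.09 bit > 0, while a constant forecast carries I = 0
```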

7. Various (*):

   (a) On the basis of the following property of the logarithm:

          ln x = x - 1   if x = 1,
          ln x < x - 1   if x ≠ 1,

       prove the following theorem: when a source generates N messages (1, ..., N), each of which is generated with probability p_i, then the entropy of this source can be characterized with the following property:

          H(X) = H(p_1, p_2, ..., p_N) ≤ lg N.

       The equality holds iff p_i = 1/N for every i.

   (b) Show that an information source X which generates messages x_0, x_1, ..., x_N following the binomial (Bernoulli) distribution, i.e., for 0 ≤ i ≤ N, 0 < p < 1 and q = 1 - p:

          Pr{X = x_i} = (N choose i) · p^i · q^(N-i),

       can be characterized with the following entropy bound:

          H(X) ≤ -N(p log_2 p + q log_2 q).

A few useful relationships are going to be given in the following form on a test sheet:

   H(X, Y) = H(X) + H(Y | X)   (chain rule)
   I(X; Y) = H(X) - H(X | Y) = H(Y) - H(Y | X) = H(X) + H(Y) - H(X, Y)
   X, Y independent: H(X, Y) = H(X) + H(Y)

Bibliography

The contents of this lecture are based on the following books:

Dominic Welsh. Codes and Cryptography. Clarendon Press, Oxford, UK, 1988: chapters 1, 2.1, 3.4, 6, appendices 1-2.

Thomas M. Cover and Joy A. Thomas. Elements of Information Theory. John Wiley & Sons, Inc., New York, NY, 1991: chapters 2, 4, 8, 16.

Gareth A. Jones and J. Mary Jones. Information and Coding Theory. Springer-Verlag London Ltd., London, UK, 2000: chapter 2.6.

Todd K. Moon. Error Correction Coding. John Wiley & Sons, Inc., Hoboken, NJ, 2005: chapters 1.1-1.3.

Stefan M. Moser and Po-Ning Chen. A Student's Guide to Coding and Information Theory. Cambridge University Press, Cambridge, UK, 2012: chapters 1.1-1.3, 5.8, 6.
