Name (1%): ____________________

Massachusetts Institute of Technology
Department of Electrical Engineering and Computer Science
Department of Mechanical Engineering
6.050J/2.110J Information and Entropy, Spring 2005

Final Exam
Issued: May 20, 2005, 1:30 PM
Due: May 20, 2005, 4:30 PM

Note: Please write your name at the top of each page in the space provided. The last page may be removed and used as a reference table for the calculation of logarithms.

Problem 1: Who's Who (4%)

For each statement, fill in a name from the box that most closely matches. There are no repeated answers.

Avogadro   Bayes   Boltzmann   Boole   Carnot   Gibbs   Hamming
Huffman   Jaynes   Joule   Kelvin   Kraft   Lempel   Maxwell
Morse   Reed   Schrödinger   Shannon   Solomon   Welch   Ziv

a. ____, a 19th-century physicist, has a unit of temperature named after him.
b. ____ reportedly secluded himself in his mountain cabin, with pearls in his ears to muffle the sound and his girlfriend in his bed to inspire him, and came up with an equation to calculate the wave function of a quantum system.
c. The measure of how much two bit strings of equal length differ is named after ____.
d. While an MIT student in the 1940s, ____ invented a famous inequality for his master's thesis.
e. ____ was one of three engineers whose names are used for a lossless compression technique for the popular GIF image format.
f. A famous inequality is named after ____, who received the first doctorate in engineering in America and later was on the faculty at Yale University.
g. The ____ constant is approximately 1.38 × 10⁻²³ Joules per Kelvin.
h. The military engineer ____ showed that all reversible heat engines have the same efficiency.
i. The channel capacity theorem proved by ____ states a possibility, not how to achieve it.
j. The algebra of binary numbers is named after the mathematician ____, who had been a childhood prodigy in Latin, publishing at the age of 12.
k. ____ conceived a Demon to show the statistical nature of the Second Law of Thermodynamics.
l. ____ promoted the Principle of Maximum Entropy as an unbiased way of assigning probabilities.
m. During an ocean journey, ____ heard that electricity could be transmitted instantaneously, and in a fit of creativity invented a code to use it to transmit arbitrary information.
n. Over one weekend, ____ solved a problem posed in a problem set for an MIT graduate subject, thereby inventing the shortest possible variable-length code.

Problem 2: Under Control (5%)

Your company has just purchased a large number of C NOT (controlled-NOT) logic gates, which are really just XOR (exclusive or) gates with an extra output. They have two inputs A and B and two outputs C and D. One output, C, is merely a copy of A; the other output, D, is B if A = 0 or NOT B if A = 1. In other words, the output D is B possibly run through a NOT gate, depending on the input A.

Your boss wants you to design all your circuits using only C NOT gates, so the company can save the cost of maintaining different components in its inventory. You wonder about the properties of this gate. You start off by modeling the gate in terms of its transition probabilities and noting some basic properties.

You know that a gate is reversible if its input can be inferred exactly from its output. Since you know nothing about how the gate might be used, you assume the four possible input combinations are equally probable.

[Diagram: the four input combinations AB = 00, 01, 10, 11 on the left and the four output combinations CD = 00, 01, 10, 11 on the right, with arrows for the transition probabilities.]

a. In the diagram at the right, show the transition probabilities.
b. What is the input information in bits?
c. What is the output information in bits?
d. What is the loss in bits?
e. What is the noise in bits?
f. What is the mutual information in bits?
g. Is this gate reversible (yes or no)?

Problem 3: Out of Control (5%)

You have concluded that the C NOT gate from Problem 2 will not satisfy your needs, and look around for another. The incoming inspector tells you that some of the gates don't work right, and calls them NOT C NOT gates. They behave like C NOT gates except that they have a defective power supply which keeps the C output from being 1 when D = 1, even though the C NOT logic might call for them both to be 1. This inspector thinks these gates are worthless but asks you what you think.

You repeat your analysis for this gate, again assuming the four possible input combinations are equally probable.

[Diagram: inputs AB = 00, 01, 10, 11 and outputs CD = 00, 01, 10, 11, with arrows for the transition probabilities.]

a. In the diagram at the right, show the transition probabilities.
b. What is the input information in bits?
c. What is the output information in bits?
d. What is the loss in bits?
e. What is the noise in bits?
f. What is the mutual information in bits?
g. Is this gate reversible (yes or no)?
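A minimal sketch of the bookkeeping these questions call for, assuming equiprobable inputs and a transition-probability matrix describing the gate (the function and variable names below are illustrative, not taken from the exam):

    import math

    def H(probs):
        """Shannon entropy in bits of a probability distribution."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def channel_quantities(p_in, transitions):
        """Input info, output info, loss, noise, and mutual information for a
        discrete channel, given input probabilities p_in[i] and transition
        probabilities transitions[i][j] = p(output j | input i)."""
        joint = [[p_in[i] * transitions[i][j] for j in range(len(transitions[i]))]
                 for i in range(len(p_in))]
        p_out = [sum(joint[i][j] for i in range(len(p_in))) for j in range(len(joint[0]))]
        I_in = H(p_in)
        I_out = H(p_out)
        H_joint = H([p for row in joint for p in row])
        noise = H_joint - I_in    # H(output | input)
        loss = H_joint - I_out    # H(input | output)
        M = I_out - noise         # mutual information (equals I_in - loss)
        return I_in, I_out, loss, noise, M

    # C NOT gate of Problem 2: AB -> CD with C = A and D = A XOR B,
    # inputs 00, 01, 10, 11 assumed equally probable.
    cnot = [[1, 0, 0, 0],   # 00 -> 00
            [0, 1, 0, 0],   # 01 -> 01
            [0, 0, 0, 1],   # 10 -> 11
            [0, 0, 1, 0]]   # 11 -> 10
    print(channel_quantities([0.25] * 4, cnot))

Problem 3 can be handled the same way once you have decided what the defective NOT C NOT gate's transition matrix actually is.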

Problem 4: MIT Customer Complaint Department (5%)

You have recently been elected President of the UA, and it is your job to transmit student complaints to MIT President Susan Hockfield so they (hopefully) can be addressed rapidly. According to the UA's research, all student complaints fall into one of five categories, with percentages shown:

   50%   Not enough homework
   30%   Campus dining options too diverse
   10%   Tuition too low
    5%   Administration too attentive
    5%   Classes too easy

Unfortunately, Hockfield doesn't have much time, so she instructs you to send only very short messages to her. Because you've taken 6.050 you know about coding schemes, so you decide to encode the complaints above in a Huffman code.

a. Design a Huffman code for the complaints above.

   Complaint                              Code
   Not enough homework                    ________
   Campus dining options too diverse      ________
   Tuition too low                        ________
   Administration too attentive           ________
   Classes too easy                       ________

b. What is the average number of bits to send one complaint?

   Average # of bits/complaint: ________
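A small sketch of how such a code could be constructed and checked, assuming the five complaint probabilities above (the heap-based construction is the standard Huffman procedure; the names are illustrative):

    import heapq

    def huffman_lengths(probs):
        """Return the Huffman codeword length for each symbol probability."""
        # Each heap entry: (probability, tie-breaker, symbol indices in this subtree)
        heap = [(p, i, [i]) for i, p in enumerate(probs)]
        heapq.heapify(heap)
        lengths = [0] * len(probs)
        tie = len(probs)
        while len(heap) > 1:
            p1, _, s1 = heapq.heappop(heap)
            p2, _, s2 = heapq.heappop(heap)
            for s in s1 + s2:      # every symbol in the merged subtree gets one bit longer
                lengths[s] += 1
            heapq.heappush(heap, (p1 + p2, tie, s1 + s2))
            tie += 1
        return lengths

    complaints = {"Not enough homework": 0.50,
                  "Campus dining options too diverse": 0.30,
                  "Tuition too low": 0.10,
                  "Administration too attentive": 0.05,
                  "Classes too easy": 0.05}
    L = huffman_lengths(list(complaints.values()))
    avg = sum(p * l for p, l in zip(complaints.values(), L))
    print(dict(zip(complaints, L)), avg)   # codeword lengths and average bits per complaint

The resulting codeword lengths satisfy the Kraft inequality, so a prefix code with those lengths can always be written down explicitly.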

Problem 5: Zero Power Cooler (30%)

Having landed a high-paying job in Boston, you decide to build a new house for yourself. You vow to use only energy-efficient techniques, based on what you know about information and entropy. Of course your house needs heating in the winter, cooling in the summer, and beverage refrigeration all year long. You decide to design a device that cools the beverage storage room without requiring any work, and therefore with no electricity bill.

Of course you are familiar with the Carnot efficiency results for reversible heat exchangers. You know that ordinarily heat engines, heat pumps, and refrigerators operate between two reservoirs at two different temperatures, and either produce or require work. But in your case you have reservoirs at three different temperatures (storage room, house, and outside). You wonder whether a reversible heat exchange system could take heat out of the storage room and either put heat into or take heat from the other two reservoirs, with no work.

a. First consider the winter. Model the outside environment as a large heat reservoir at 275 K (about 2 °C or 35 °F). You want your house to be at 295 K (about 22 °C or 72 °F) and the storage room at 280 K (about 7 °C or 44 °F). Is such a system possible, at least in principle? Or would it violate the Second Law of Thermodynamics? If it is possible, give the heat that would be exchanged with each of the three reservoirs if 1 Joule of heat is taken from the storage room at 280 K. If it is not possible, briefly explain why.

b. Next, consider the summer, when the outside temperature is 300 K (about 27 °C or 80 °F). The house and storage room temperatures are the same as for the winter, namely 295 K and 280 K. You wonder whether a reversible heat exchange system can be designed to take energy out of the storage area without requiring any work. Is such a system possible, at least in principle? Or would it violate the Second Law of Thermodynamics? If it is possible, give the heat that would be exchanged with each of the three reservoirs if 1 Joule of heat is taken from the refrigerator at 280 K. If it is not possible, briefly explain why.
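A small numerical sketch of the two balance conditions a reversible, zero-work exchanger would have to satisfy, namely energy conservation and zero net entropy change, using the temperatures given above (the sign convention and the names below are illustrative assumptions, not part of the exam):

    # Reversible, zero-work heat exchange among three reservoirs:
    #   energy balance:   q_storage + q_house + q_outside = 0
    #   entropy balance:  q_storage/T_storage + q_house/T_house + q_outside/T_outside = 0
    # with q_storage = -1 J (1 Joule removed from the storage room); heat delivered
    # to a reservoir is counted as positive.

    def zero_work_heats(T_storage, T_house, T_outside, q_storage=-1.0):
        """Solve the 2x2 linear system for the heats delivered to house and outside."""
        a, b = 1.0, 1.0                       # q_house + q_outside = -q_storage
        c, d = 1.0 / T_house, 1.0 / T_outside # q_house/T_h + q_outside/T_o = -q_storage/T_s
        e = -q_storage
        f = -q_storage / T_storage
        det = a * d - b * c
        q_house = (e * d - b * f) / det
        q_outside = (a * f - e * c) / det
        return q_house, q_outside

    print(zero_work_heats(280, 295, 275))   # winter case
    print(zero_work_heats(280, 295, 300))   # summer case

Whether each solution is physically acceptable is exactly the judgment the problem asks you to make.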

Problem 6: Casino (20%)

On your vacation you find yourself in a gambling casino that caters to nerds. One of the games catches your attention because you wonder about the probabilities. In this game, you pay a nickel (5 cents) to play, and then a coin is chosen at random from a big jar and given to you. The jar contains pennies (worth 1 cent), nickels (5 cents), and quarters (25 cents). This game, unlike slot machines, has some payout each time you play.

You ask the owner of the casino about the probabilities P, N, and Q of the three coins (penny, nickel, and quarter) being selected, but the only information he gives you is that the average payout is 4 cents per play. You notice that the jar is so large that P, N, and Q remain the same all day long.

a. What values of P, N, and Q (nonnegative, no larger than 1) are possible?

   ____ ≤ P ≤ ____
   ____ ≤ N ≤ ____
   ____ ≤ Q ≤ ____

b. You decide to estimate P, N, and Q using the Principle of Maximum Entropy (if the owner made this choice it would be exciting to nerd customers, because they would gain the most information when learning which coin is chosen). Start this estimate by eliminating P and N and writing the entropy (your uncertainty about the coin to be selected) as a function of Q:

   Entropy = ____________________

   Without a calculator you can't find the maximum of this expression, so instead you guess that it is about 0.5 bits.

Then you realize that the owner need not stock the jar using maximum entropy. He could set the probabilities so as to maximize the number of quarters paid out, while still keeping an average payout of 4 cents (this choice would be exciting to non-nerd customers hoping to strike it rich).

c. With this strategy, what are P, N, Q, and the entropy (to two decimal places)?

   P = ____, N = ____, Q = ____, Entropy = ____.

d. Was your earlier quick guess for the entropy (when you thought the Principle of Maximum Entropy was used by the owner) a good one? Explain your answer.

e. (Extra credit) Another strategy for the owner would be to stock the jar with no quarters at all, so Q = 0 (this choice would appeal only to nerd customers). For this strategy, what are P, N, Q, and the entropy (to two decimal places)?

   P = ____, N = ____, Q = ____, Entropy = ____.
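As an arithmetic cross-check on parts a and b, a brute-force sketch that scans the feasible range of Q under the two constraints P + N + Q = 1 and P + 5N + 25Q = 4 (illustrative only; the exam expects the elimination and the reasoning to be done by hand):

    import math

    def entropy(ps):
        return -sum(p * math.log2(p) for p in ps if p > 0)

    best = (-1.0, None)
    steps = 100000
    for i in range(steps + 1):
        Q = 0.125 * i / steps         # Q can run up to 1/8, where N reaches 0
        N = (3 - 24 * Q) / 4          # from P + 5N + 25Q = 4 combined with P + N + Q = 1
        P = 1 - N - Q
        if min(P, N, Q) < 0:
            continue
        S = entropy([P, N, Q])
        if S > best[0]:
            best = (S, (P, N, Q))
    print(best)   # maximum entropy in bits and the corresponding (P, N, Q)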

Problem 7: The Telephone Game (30%)

The Telephone Game illustrates how correct information gets converted into false rumors. In the game, one person (Alice) sends a message to another (Bob) through a chain of humans, each of whom potentially corrupts the message. Thus Alice tells person #1, who then tells person #2, and so on, until the last person in the chain tells Bob. Then Bob and Alice announce their versions of the message, normally accompanied by amazement at how different they are.

Consider the case where a single bit is being passed and each person in the chain has a 20% probability of passing on a bit different from the one she received. Thus we model each person in the chain as a symmetric binary channel, as shown in Figure 1.

[Figure 1: Simple model of a person. A symmetric binary channel with input x_0 and output x_1: a 0 or 1 is passed through unchanged with probability 0.8 and flipped with probability 0.2.]

The game is being demonstrated for you at a party one day. Alice and Bob take their positions at opposite ends of the chain, and Alice whispers the value of x_0 to person #1. You know that person #1, like the other members of the chain, has a 20% probability of changing the bit she hears. Part a. concerns the model for this person, and parts b. and c. concern the behavior of the chain.

a. At first, you do not know what Alice has told person #1. Naturally, you express your state of knowledge in terms of the two probabilities p(x_0 = 0) and p(x_0 = 1). To avoid any unintended bias you use the Principle of Maximum Entropy to conclude that each of these probabilities is equal to 0.5. Then you calculate your uncertainty I_0 about the value of x_0, the output probabilities p(x_1 = 0) and p(x_1 = 1), and your uncertainty I_1 about the value of x_1. Then you calculate the channel noise N, channel loss L, and mutual information M, all in bits:

   p(x_0 = 0) = 0.5    p(x_0 = 1) = 0.5    I_0 = ____ bits
   p(x_1 = 0) = ____   p(x_1 = 1) = ____   I_1 = ____ bits
   N = ____    L = ____    M = ____
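For readers checking their arithmetic, here is the same channel accounting specialized to this symmetric binary channel, assuming the uniform input of part a (a sketch; the names are illustrative):

    import math

    def Hb(p):
        """Binary entropy in bits."""
        return 0.0 if p in (0.0, 1.0) else -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    eps = 0.2                              # probability a person flips the bit
    p0 = 0.5                               # p(x_0 = 0), from the Principle of Maximum Entropy
    I0 = Hb(p0)                            # uncertainty about x_0
    p1 = p0 * (1 - eps) + (1 - p0) * eps   # p(x_1 = 0)
    I1 = Hb(p1)                            # uncertainty about x_1
    N = Hb(eps)                            # noise, H(x_1 | x_0), for a symmetric channel
    M = I1 - N                             # mutual information
    L = I0 - M                             # loss, H(x_0 | x_1)
    print(I0, p1, I1, N, L, M)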

Next, consider the behavior of a cascade of independent identical channels representing the individuals passing the message, as illustrated in Figure 2. Let x_k represent the output of the k-th channel.

[Figure 2: Simple model of the telephone game. A cascade of identical symmetric binary channels x_0 → x_1 → x_2 → x_3 → ..., each passing its input bit unchanged with probability 0.8 and flipping it with probability 0.2.]

b. Now suppose that you know that Alice's bit is 0, so that x_0 = 0. Having just calculated probabilities for x_1, you wonder how much you know about the other values, x_2, x_3, x_4, ..., x_k, .... Is it true that p(x_k = 0) > 0.5 for every channel output x_k? In other words, is every value passed along more likely to be 0 than 1? Write a paragraph defending your conclusion. If possible, make this an outline of a proof. You may find some version of the principle of induction helpful. (Pictures and equations are allowed in the paragraph, if needed.)

c. You conclude (correctly) that p(x_k = 0) decreases as the message moves along the chain, i.e.,

   p(x_0 = 0) > p(x_1 = 0) > p(x_2 = 0) > ... > p(x_k = 0) > p(x_{k+1} = 0) > ...     (1)

Let I_k be your uncertainty about x_k in bits. Does the uncertainty about x_k increase as you move through successive channels? In other words, is the following sequence of inequalities true? Write a paragraph defending your answer.

   I_0 < I_1 < I_2 < ... < I_k < I_{k+1} < ...     (2)
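A short simulation sketch relevant to parts b and c: it simply iterates the update p(x_{k+1} = 0) = 0.8 p(x_k = 0) + 0.2 p(x_k = 1) starting from x_0 = 0 and tabulates the uncertainty I_k at each stage; the written arguments the exam asks for are still yours to supply:

    import math

    def Hb(p):
        return 0.0 if p in (0.0, 1.0) else -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    eps = 0.2
    p = 1.0                  # p(x_0 = 0), given that Alice's bit is 0
    for k in range(10):
        I = Hb(p)            # uncertainty I_k about x_k, in bits
        print(f"k={k}  p(x_k=0)={p:.4f}  I_k={I:.4f}")
        p = p * (1 - eps) + (1 - p) * eps   # one more person in the chain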

Logarithm and Entropy Table

This page is provided so that you may rip it off the exam to use as a separate reference table. In Table 1, the entropy S = p log2(1/p) + (1 − p) log2(1/(1 − p)).

    p      log2(1/p)    S
    1/8      3.00     0.54
    1/5      2.32     0.72
    1/4      2.00     0.81
    3/10     1.74     0.88
    1/3      1.58     0.92
    3/8      1.42     0.95
    2/5      1.32     0.97
    1/2      1.00     1.00
    3/5      0.74     0.97
    5/8      0.68     0.95
    2/3      0.58     0.92
    7/10     0.51     0.88
    3/4      0.42     0.81
    4/5      0.32     0.72
    7/8      0.19     0.54

Table 1: Table of logarithms in base 2 and entropy in bits
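For anyone reading this transcription rather than sitting the exam, the table is easy to regenerate; a minimal sketch:

    import math
    from fractions import Fraction

    def S(p):
        """Entropy in bits of a binary choice with probability p."""
        return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

    ps = ["1/8", "1/5", "1/4", "3/10", "1/3", "3/8", "2/5", "1/2",
          "3/5", "5/8", "2/3", "7/10", "3/4", "4/5", "7/8"]
    for s in ps:
        p = float(Fraction(s))
        print(f"{s:>4}  log2(1/p) = {math.log2(1 / p):.2f}  S = {S(p):.2f}")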