Massachusetts Institute of Technology


Name: ___________________________________

Massachusetts Institute of Technology
Department of Electrical Engineering and Computer Science
Department of Mechanical Engineering

6.050J/2.110J Information and Entropy, Spring 2004

Issued: April 2, 2004, : PM
Quiz
Due: April 2, 2004, 2: PM

Note: Please write your name at the top of each page in the space provided. The last page may be removed and used as a reference table for the calculation of logarithms.

Problem 1: Hot Stuff (35%)

The NanoFurnace Company makes heaters for assembly lines. Their product looks at objects passing by and warms them up if they are too cold. The furnace control system has two parts. First there is a sensor, which is supposed to output a negative voltage N when the object is cold and a positive voltage P if the object is hot. Then there is an inference program that sets the furnace command F to 1 if the furnace is needed and to 0 otherwise.

Unfortunately the sensors are cheap but unreliable. Every cold object is recognized and leads to a negative voltage, but only 5/8 (62.5%) of the hot objects are correctly identified. Since, for all the customers the NanoFurnace company serves, 80% of the objects are already hot enough and 20% are cold, this results in a large number of errors.

NanoFurnace's programmer, Ben Bitdiddle, wrote the inference program so it would compensate for this problem. It takes as input whether the voltage is positive or negative, and produces the logical value 1 if and only if the object temperature is more likely to be cold than hot for this voltage level. Of course, Ben knew the probabilities of objects being hot and cold (p(H) = 0.8, p(C) = 0.2) and the various sensor probabilities (for example, p(P | H) = 0.625).

a. Some customers have complained and the company has asked you for help. Your first step is to model the sensor as a nondeterministic process, as in the diagram below. Complete the following figure by indicating the transition probabilities and the output probabilities.

[Figure: sensor process diagram. Inputs: C with p(C) = 0.2 and H with p(H) = 0.8. Outputs: N with p(N) = ____ and P with p(P) = ____. Transition probabilities a_PC, a_NC, a_NH, a_PH to be filled in.]

a_PC = ____    a_NH = ____
a_NC = ____    a_PH = ____
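All of these quantities follow from the probabilities stated in the problem. As a rough numerical sketch (not the official quiz solution; variable names are illustrative), the output probabilities and the posterior probabilities that Ben's program relies on can be tabulated like this:

```python
# Rough sketch of the sensor model above; not the official quiz solution.
p_C, p_H = 0.2, 0.8                   # prior probabilities: cold, hot
p_P_given_H = 0.625                   # 5/8 of hot objects give a positive voltage
p_N_given_H = 1 - p_P_given_H         # the rest are misread as negative
p_N_given_C, p_P_given_C = 1.0, 0.0   # every cold object gives a negative voltage

# Output (voltage) probabilities
p_N = p_N_given_C * p_C + p_N_given_H * p_H
p_P = p_P_given_C * p_C + p_P_given_H * p_H

# Posterior probability of "cold" given each voltage (Bayes' theorem),
# which is what Ben's program compares against 1/2
p_C_given_N = p_N_given_C * p_C / p_N
p_C_given_P = p_P_given_C * p_C / p_P

print(f"p(N) = {p_N:.3f}, p(P) = {p_P:.3f}")
print(f"p(C|N) = {p_C_given_N:.3f}, p(C|P) = {p_C_given_P:.3f}")
```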

b. Your knowledge of information and entropy helps you give the input information I_in, output information I_out, noise N, loss L, and mutual information M of the sensor, all in bits.

I_in = ____    L = ____    M = ____    N = ____    I_out = ____

c. You also figure out what Ben's program tells the furnace when a positive voltage is observed, and when a negative voltage is observed. Model this program as a process with possible inputs P and N, and output F. Add the transition arrows and probabilities to the diagram below, and insert the probabilities at the input and output (only put in transition arrows with nonzero probabilities).

[Figure: program process diagram. Inputs: N with p(N) = ____ and P with p(P) = ____. Outputs: F = 1 with p(F = 1) = ____ and F = 0 with p(F = 0) = ____.]

d. What is the input information, loss, mutual information, noise, and output information of Ben's program, in bits?

I_in = ____    L = ____    M = ____    N = ____    I_out = ____

e. Explain in words why the customers might have complained.
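For parts (b) and (d), one way to compute these five quantities for any discrete process is sketched below, assuming the usual definitions I_in = H(input), I_out = H(output), noise N = H(output | input), loss L = H(input | output), and M = I_in - L = I_out - N. The function and the example call are illustrative, not the quiz answers.

```python
# Sketch: information quantities of a discrete process, given the input
# distribution and the transition (conditional output) probabilities.
from math import log2

def entropy(dist):
    return sum(p * log2(1 / p) for p in dist if p > 0)

def channel_quantities(p_in, transitions):
    # p_in[i]: probability of input i; transitions[i][j]: p(output j | input i)
    n_out = len(transitions[0])
    p_out = [sum(p_in[i] * transitions[i][j] for i in range(len(p_in)))
             for j in range(n_out)]
    I_in, I_out = entropy(p_in), entropy(p_out)
    # Noise N = H(output | input): average entropy of each transition row
    N = sum(p_in[i] * entropy(transitions[i]) for i in range(len(p_in)))
    M = I_out - N   # mutual information
    L = I_in - M    # loss
    return I_in, L, M, N, I_out

# Example: the sensor, with inputs (C, H) and outputs (N, P)
print(channel_quantities([0.2, 0.8], [[1.0, 0.0], [0.375, 0.625]]))
```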

Problem 2: A Pretty Bad Channel (30%)

A communication channel can transmit 1 bit per second with the error probabilities given in the figure below. (This is the binary symmetric channel we have studied, with error probabilities that are independent from one transmission to another.)

[Figure: binary symmetric channel; each input bit is received correctly with probability 0.75 and flipped with probability 0.25.]

a. Find the channel capacity of this channel in bits per second.

C = ____

To reduce the probability of error we introduce triple redundancy coding at the transmitter: a 0 is encoded as 000 and a 1 is encoded as 111. A corresponding decoder is installed at the receiver end.

[Figure: block diagram with Channel Coder, Channel, and Channel Decoder in series.]

b. Find the probability of zero, one, two, or three errors when a sequence of three bits is sent through the channel (assume the errors are independent).

p(0 errors) = ____    p(1 error) = ____    p(2 errors) = ____    p(3 errors) = ____

c. The decoder is a single-error-correcting decoder: it puts out the correct bit that was transmitted if there is either 0 or 1 error in transmission through the channel. Give the output of this decoder for each possible 3-bit input.

d. Find the probability that the complete system makes an error if a 0 is transmitted.

p(error | 0) = ____
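The arithmetic behind this problem can be checked with a short sketch, assuming the crossover probability is 0.25, the capacity of a binary symmetric channel is 1 - H(p) bits per use, and the decoder takes a majority vote over the three received bits:

```python
# Sketch of the binary-symmetric-channel arithmetic; not the official solution.
from math import log2, comb

p = 0.25  # probability that a transmitted bit is flipped

# Capacity of a BSC, in bits per use (= bits/s here, since the channel runs at 1 bit/s)
capacity = 1 - (p * log2(1 / p) + (1 - p) * log2(1 / (1 - p)))

# Probability of exactly k bit errors in a 3-bit codeword (independent errors)
p_k_errors = {k: comb(3, k) * p**k * (1 - p)**(3 - k) for k in range(4)}

# A majority-vote (single-error-correcting) decoder fails only on 2 or 3 errors
p_decoded_error = p_k_errors[2] + p_k_errors[3]

print(f"C = {capacity:.3f} bits/s")
print({k: round(v, 4) for k, v in p_k_errors.items()})
print(f"p(error | 0 transmitted) = {p_decoded_error:.4f}")
```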

Problem 3: Building Constraints (35%)

MIT has gone through a spurt of growth recently. There are so many new buildings that you have lost track. There are three types of buildings: cheap (cost $5M), expensive (cost $2M), and ridiculous (cost $3M).

Your relatives visit and you give them a tour of campus. While showing them around, you stop in front of one of these new buildings. They are curious how much it cost, but since you don't know in detail, you use the principles you learned in Information and Entropy to help you cope with your uncertainty. You use probabilities to express your state of knowledge. Let C, E, and R stand for the probabilities that the building is cheap, expensive, or ridiculous.

a. In the absence of other information, what set of probabilities C, E, and R expresses your knowledge without any additional assumptions?

C = ____    E = ____    R = ____

b. You remember reading in The Tech that the average cost of all the new buildings is $2M. Write a constraint equation relating C, E, and R that incorporates this new information.

Constraint equation: ____

c. What values of C, E, and R are compatible with this information and also with the fact that C, E, and R must add up to 1?

C_min = ____    C_max = ____    E_min = ____    E_max = ____    R_min = ____    R_max = ____

d. You know that the set of probabilities that is consistent with your knowledge but makes no additional assumptions is found by expressing your uncertainty as a function of one of the probabilities and then finding where its maximum occurs. Write this formula as a function of any one of the three probabilities C, E, and R. (You do not need to find this maximum point.)

Equation for the entropy: ____
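The maximization described in part (d) can be explored numerically. In the sketch below the three costs and the average are placeholder values chosen only to illustrate the method; they are not taken from the problem statement.

```python
# Sketch of the maximum-entropy calculation with placeholder (hypothetical) costs.
from math import log2

cost_cheap, cost_exp, cost_rid = 50, 200, 300   # hypothetical costs, $M
avg = 200                                       # hypothetical average cost, $M

def entropy(dist):
    return sum(p * log2(1 / p) for p in dist if p > 0)

best = None
steps = 10000
for i in range(steps + 1):
    C = i / steps
    # Solve the two constraints C + E + R = 1 and
    # cost_cheap*C + cost_exp*E + cost_rid*R = avg for R and E in terms of C.
    R = (avg - cost_exp + (cost_exp - cost_cheap) * C) / (cost_rid - cost_exp)
    E = 1 - C - R
    if 0 <= R <= 1 and 0 <= E <= 1:
        S = entropy([C, E, R])
        if best is None or S > best[0]:
            best = (S, C, E, R)

print("max entropy %.3f bits at C = %.3f, E = %.3f, R = %.3f" % best)
```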

Logarithm and Entropy Table

This page is provided so that you may rip it off the quiz and use it as a separate reference table. In Table Q1, the entropy is S = p log2(1/p) + (1 - p) log2(1/(1 - p)).

p           1/8    1/5    1/4    3/10   1/3    3/8    2/5    1/2    3/5    5/8    2/3    7/10   3/4    4/5    7/8
log2(1/p)   3.00   2.32   2.00   1.74   1.58   1.42   1.32   1.00   0.74   0.68   0.58   0.51   0.42   0.32   0.19
S           0.54   0.72   0.81   0.88   0.92   0.95   0.97   1.00   0.97   0.95   0.92   0.88   0.81   0.72   0.54

Table Q1: Table of logarithms in base 2 and entropy in bits
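The entries in Table Q1 can be regenerated with a few lines of Python (rounded to two decimal places):

```python
# Reproduce the logarithm and entropy reference table above.
from math import log2

fractions = [(1, 8), (1, 5), (1, 4), (3, 10), (1, 3), (3, 8), (2, 5), (1, 2),
             (3, 5), (5, 8), (2, 3), (7, 10), (3, 4), (4, 5), (7, 8)]
for num, den in fractions:
    p = num / den
    S = p * log2(1 / p) + (1 - p) * log2(1 / (1 - p))
    print(f"p = {num}/{den}: log2(1/p) = {log2(1 / p):.2f}, S = {S:.2f} bits")
```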