Entropies & Information Theory
1 Entropies & Information Theory LECTURE I Nilanjana Datta, University of Cambridge, U.K. See lecture notes (Lecture 1 - Lecture 5) on: http://www.qi.damtp.cam.ac.uk/node/223
2 Quantum Information Theory: born out of Classical Information Theory, the mathematical theory of the storage, transmission & processing of information. Quantum Information Theory studies how these tasks can be accomplished using quantum-mechanical systems as information carriers (e.g. photons, electrons, ...). [Diagram: Quantum Information Theory shown as the overlap of Quantum Physics and Information Theory]
3 The underlying quantum mechanics provides distinctively new features. These can be exploited to: improve the performance of certain information-processing tasks, as well as accomplish tasks which are impossible in the classical realm!
4 Classical Information Theory: 1948, Claude Shannon. He posed 2 questions: (Q1) What is the limit to which information can be reliably compressed? [relevance: physical limit on storage] (Q2) What is the maximum amount of information that can be transmitted reliably per use of a communications channel? [relevance: noise in communications channels] Here information = data = signals = messages = outputs of a source.
5 Classical Information Theory: 1948, Claude Shannon. He posed 2 questions: (Q1) What is the limit to which information can be reliably compressed? (A1) Shannon's Source Coding Theorem: data compression limit = Shannon entropy of the source. (Q2) What is the maximum amount of information that can be transmitted reliably per use of a communications channel? (A2) Shannon's Noisy Channel Coding Theorem: the maximum rate of info transmission is given in terms of the mutual information.
6 Shannon: information = uncertainty. What is information? Information gain = decrease in the uncertainty of an event, so a measure of information is a measure of uncertainty. Surprisal or self-information: consider an event described by a random variable (r.v.) $X \sim p(x)$ (p.m.f.), $x \in J$ (finite alphabet). A measure of the uncertainty in getting outcome $x$: $\gamma(x) := -\log p(x)$, where $\log \equiv \log_2$ (a highly improbable outcome is surprising!). Note that $\gamma(x)$ only depends on $p(x)$, not on the values taken by $X$; it is continuous in $p(x)$ and additive for independent events.
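A minimal Python sketch of the surprisal and its additivity for independent events (the probabilities are illustrative values, not from the slides):

```python
import math

def surprisal(p: float) -> float:
    """Surprisal (self-information) in bits: gamma(x) = -log2 p(x)."""
    return -math.log2(p)

# A rare outcome is more surprising than a common one.
print(surprisal(0.5))    # 1.0 bit  (fair coin flip)
print(surprisal(1/64))   # 6.0 bits (highly improbable outcome)

# Additivity for independent events: p(x, y) = p(x) * p(y)
px, py = 0.25, 0.125
assert math.isclose(surprisal(px * py), surprisal(px) + surprisal(py))
```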
7 Shannon entropy = average surprisal. Defn: the Shannon entropy $H(X)$ of a discrete r.v. $X \sim p(x)$ is $H(X) := \mathbb{E}[\gamma(X)] = -\sum_{x \in J} p(x) \log p(x)$, with $\log \equiv \log_2$. Convention: $0 \log 0 = 0$, since $\lim_{w \to 0} w \log w = 0$ (if an event has zero probability, it does not contribute to the entropy). $H(X)$ is a measure of the uncertainty of the r.v. $X$; it also quantifies the amount of info we gain on average when we learn the value of $X$. Notation: $H(X) \equiv H(p_X)$, where $p_X := \{p(x)\}_{x \in J}$.
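A short Python sketch of this definition, handling the $0 \log 0 = 0$ convention by skipping zero-probability outcomes:

```python
import math

def shannon_entropy(p: list[float]) -> float:
    """H(X) = -sum_x p(x) log2 p(x), with the convention 0 log 0 = 0."""
    return -sum(px * math.log2(px) for px in p if px > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit
print(shannon_entropy([1.0, 0.0]))   # 0.0 bits (deterministic: no uncertainty)
print(shannon_entropy([0.25] * 4))   # 2.0 bits (uniform on 4 letters)
```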
8 Example: binary entropy. $X \sim p(x)$, $J = \{0,1\}$, $p(0) = p$, $p(1) = 1 - p$: $H(X) = -p \log p - (1-p) \log(1-p) =: h(p)$. Properties: $h(p) = 0$ for $p = 0$ or $p = 1$ (no uncertainty); $h(p) = 1$ for $p = 0.5$ (maximum uncertainty); $h(p)$ is a concave and continuous function of $p$. [Plot: $h(p)$ against $p$ for $0 \le p \le 1$]
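A quick numerical check of these properties (a sketch only; the concavity test below is a single spot-check, not a proof):

```python
import math

def h(p: float) -> float:
    """Binary entropy h(p) = -p log2 p - (1-p) log2 (1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

assert h(0.0) == h(1.0) == 0.0       # no uncertainty at the endpoints
assert math.isclose(h(0.5), 1.0)     # maximum uncertainty at p = 0.5
# Concavity spot-check: h((a+b)/2) >= (h(a) + h(b)) / 2
a, b = 0.1, 0.6
assert h((a + b) / 2) >= (h(a) + h(b)) / 2
```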
9 Operational significance of the Shannon entropy: it is the optimal rate of data compression for a classical memoryless (i.i.d.) information source. Classical information source: its outputs/signals are sequences of letters from a finite set $J$, the source alphabet. Examples: (i) binary alphabet $J = \{0,1\}$; (ii) telegraph English: 26 letters + a space; (iii) written English: 26 letters in upper & lower case + punctuation.
10 Simplest example: a memoryless source, whose successive signals are independent of each other. It is characterized by a probability distribution $\{p(u)\}_{u \in J}$: on each use of the source, a letter $u \in J$ is emitted with probability $p(u)$. Modelled by a sequence of i.i.d. random variables $U_1, U_2, \ldots, U_n$ with $p(u) = P(U_k = u)$ for all $u \in J$, $1 \le k \le n$. Signal emitted by $n$ uses of the source: $u^{(n)} := (u_1, u_2, \ldots, u_n)$, with $p(u^{(n)}) = P(U_1 = u_1, U_2 = u_2, \ldots, U_n = u_n) = p(u_1) p(u_2) \cdots p(u_n)$. Shannon entropy of the source: $H(U) := -\sum_{u \in J} p(u) \log p(u)$.
11 (Q) Why is data compression possible? (A) There is redundancy in the info emitted by the source -- an info source typically produces some outputs more frequently than others: in English text "e" occurs more frequently than "z". During data compression one exploits this redundancy in the data to form the most compressed version possible. Variable length coding: more frequently occurring signals (e.g. "e") are assigned shorter descriptions (fewer bits) than the less frequent ones (e.g. "z"). Fixed length coding: identify a set of signals which have a high probability of occurrence (the typical signals); assign a unique fixed-length ($l$) binary string to each of them; all other (atypical) signals are assigned a single binary string of the same length ($l$).
12 Typical sequences. Defn: consider an i.i.d. info source $U_1, U_2, \ldots$, $U_i \sim p(u)$, $u \in J$. For any $\varepsilon > 0$, the sequences $u^{(n)} := (u_1, u_2, \ldots, u_n) \in J^n$ for which $2^{-n(H(U) + \varepsilon)} \le p(u_1, u_2, \ldots, u_n) \le 2^{-n(H(U) - \varepsilon)}$, where $H(U)$ is the Shannon entropy of the source, are called typical sequences. $T_\varepsilon^{(n)}$: the typical set = the set of typical sequences. Note: typical sequences are almost equiprobable, i.e. for $u^{(n)} \in T_\varepsilon^{(n)}$, $p(u^{(n)}) \approx 2^{-n H(U)}$.
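An illustrative Python sketch of this concentration (the three-letter source below is a hypothetical choice): the empirical rate $-\frac{1}{n} \log_2 p(u^{(n)})$ of a randomly emitted signal approaches $H(U)$ as $n$ grows, i.e. a random signal is typical with high probability.

```python
import math, random

random.seed(0)
letters, probs = ["a", "b", "c"], [0.7, 0.2, 0.1]
H = -sum(p * math.log2(p) for p in probs)   # Shannon entropy of the source

def empirical_rate(n: int) -> float:
    """Draw one length-n i.i.d. signal and return -(1/n) log2 p(u^(n))."""
    seq = random.choices(letters, weights=probs, k=n)
    logp = sum(math.log2(probs[letters.index(u)]) for u in seq)
    return -logp / n

for n in (10, 100, 10_000):
    print(n, round(empirical_rate(n), 3), "vs H(U) =", round(H, 3))
# As n grows, -(1/n) log2 p(u^(n)) concentrates around H(U),
# so the emitted signal satisfies p(u^(n)) ~ 2^(-n H(U)).
```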
13 Operational significance of the Shannon entropy. (Q) What is the optimal rate of data compression for such a source, i.e. the minimum number of bits needed to store the signals emitted per use of the source (for reliable data compression)? The optimal rate is evaluated in the asymptotic limit $n \to \infty$, where $n$ = number of uses of the source; one requires $p_{\mathrm{error}}^{(n)} \to 0$ as $n \to \infty$. (A) Optimal rate of data compression = $H(U)$, the Shannon entropy of the source.
14 Compression-decompression scheme. Suppose $U_1, U_2, \ldots, U_n$, $U_i \sim p(u)$, $u \in J$, is an i.i.d. information source with Shannon entropy $H(U)$. A compression scheme of rate $R$ is a map $E_n : J^n \to \{0,1\}^{m_n}$ taking a signal $u^{(n)} := (u_1, u_2, \ldots, u_n)$ to a binary string $x := (x_1, x_2, \ldots, x_{m_n})$ with $m_n = \lceil n R \rceil$. (When is this a compression scheme?) Decompression: $D_n : \{0,1\}^{m_n} \to J^n$. Average probability of error: $p_{\mathrm{av}}^{(n)} := \sum_{u^{(n)} : D_n(E_n(u^{(n)})) \ne u^{(n)}} p(u^{(n)})$. The compression-decompression scheme is reliable if $p_{\mathrm{av}}^{(n)} \to 0$ as $n \to \infty$.
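A brute-force sketch of fixed-length typical-set coding for a toy binary source (assumptions: a small $n$ so all $2^n$ sequences can be enumerated, and an arbitrary $\varepsilon = 0.3$; not part of the original slides):

```python
import itertools, math

p1 = 0.2                                                 # P(U = 1), toy i.i.d. source
H = -p1 * math.log2(p1) - (1 - p1) * math.log2(1 - p1)   # H(U) ~ 0.722
eps = 0.3

def prob(seq):
    """p(u^(n)) = product of single-letter probabilities."""
    k = sum(seq)
    return (p1 ** k) * ((1 - p1) ** (len(seq) - k))

for n in (8, 12, 16):
    lo, hi = 2 ** (-n * (H + eps)), 2 ** (-n * (H - eps))
    typical = [s for s in itertools.product((0, 1), repeat=n)
               if lo <= prob(s) <= hi]
    # Fixed-length code: one binary index per typical sequence,
    # plus a single reserved string for all atypical sequences.
    code_len = math.ceil(math.log2(len(typical) + 1))
    p_err = 1 - sum(prob(s) for s in typical)   # an atypical signal is lost
    print(n, "rate =", round(code_len / n, 3), "p_err =", round(p_err, 3))
# p_err decays as n grows; with small eps and much larger n, the rate
# tends to H(U) -- the content of the source coding theorem below.
```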
15 Shannon's Source Coding Theorem: suppose $U_1, U_2, \ldots, U_n$, $U_i \sim p(u)$, $u \in J$, is an i.i.d. information source with Shannon entropy $H(U)$. If $R > H(U)$, then there exists a reliable compression scheme of rate $R$ for the source. If $R < H(U)$, then any compression scheme of rate $R$ will not be reliable.
16 Entropies for a pair of random variables. Consider a pair of discrete random variables $X \sim p(x)$, $x \in J_X$, and $Y \sim p(y)$, $y \in J_Y$, given their joint probabilities $P(X = x, Y = y) = p(x, y)$ and their conditional probabilities $P(Y = y \mid X = x) = p(y \mid x)$. Joint entropy: $H(X,Y) := -\sum_{x \in J_X} \sum_{y \in J_Y} p(x,y) \log p(x,y)$. Conditional entropy: $H(Y|X) := \sum_{x \in J_X} p(x) H(Y \mid X = x) = -\sum_{x \in J_X} \sum_{y \in J_Y} p(x,y) \log p(y \mid x)$. Note that $H(Y|X) \ge 0$.
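These definitions can be checked on a small example; the joint p.m.f. below is hypothetical, chosen only for illustration:

```python
import math

def H(dist):
    """Entropy in bits of a p.m.f. given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# A hypothetical joint p.m.f. p(x,y) on {0,1} x {0,1}.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
px = {x: sum(p for (a, _), p in pxy.items() if a == x) for x in (0, 1)}

H_XY = H(pxy)
# H(Y|X) = -sum_{x,y} p(x,y) log2 p(y|x), with p(y|x) = p(x,y) / p(x).
H_Y_given_X = -sum(p * math.log2(p / px[x]) for (x, _), p in pxy.items() if p > 0)

print(round(H_XY, 4), round(H_Y_given_X, 4))
assert H_Y_given_X >= 0
```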
17 Entropies for a pair of random variables. Relative entropy: a measure of the "distance" between two probability distributions $p = \{p(x)\}_{x \in J}$ and $q = \{q(x)\}_{x \in J}$: $D(p \| q) := \sum_{x \in J} p(x) \log \frac{p(x)}{q(x)}$. Conventions: $0 \log \frac{0}{u} = 0$ and $u \log \frac{u}{0} = \infty$ for $u > 0$. $D(p \| q) \ge 0$, with $D(p \| q) = 0$ if & only if $p = q$. BUT it is not a true distance: it is not symmetric and does not satisfy the triangle inequality.
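A small sketch showing non-negativity and the asymmetry of the relative entropy (illustrative distributions with full support, so the $\frac{u}{0}$ convention is never triggered):

```python
import math

def D(p, q):
    """Relative entropy D(p||q) = sum_x p(x) log2 (p(x)/q(x))."""
    return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(D(p, q), D(q, p))      # different values: D is not symmetric
assert D(p, p) == 0          # zero if & only if the distributions coincide
assert D(p, q) >= 0 and D(q, p) >= 0
```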
18 Entropies for a pair of random variables. Mutual information: a measure of the amount of info one r.v. contains about another r.v. For $X \sim p(x)$, $Y \sim p(y)$: $I(X : Y) := \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x) p(y)} = D(p_{XY} \| p_X p_Y)$, where $p_{XY} := \{p(x,y)\}_{x,y}$, $p_X := \{p(x)\}_x$, $p_Y := \{p(y)\}_y$. Chain rules: $H(X,Y) = H(Y|X) + H(X)$, and $I(X : Y) = H(X) + H(Y) - H(X,Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)$.
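Continuing the small hypothetical example from slide 16, a sketch computing $I(X:Y)$ as $D(p_{XY} \| p_X p_Y)$ and verifying one chain-rule identity:

```python
import math

def H(probs):
    """Entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Same illustrative joint p.m.f. as before.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
px = {x: sum(p for (a, _), p in pxy.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in pxy.items() if b == y) for y in (0, 1)}

# I(X:Y) = D(p_XY || p_X p_Y)
I = sum(p * math.log2(p / (px[x] * py[y])) for (x, y), p in pxy.items() if p > 0)

# Check the chain-rule identity I(X:Y) = H(X) + H(Y) - H(X,Y).
H_X, H_Y, H_XY = H(px.values()), H(py.values()), H(pxy.values())
assert math.isclose(I, H_X + H_Y - H_XY)
print(round(I, 4))   # > 0 here, since X and Y are correlated
```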
19 Properties of entropies. Let $X \sim p(x)$, $Y \sim p(y)$ be discrete random variables. Then: $H(X) \ge 0$, with equality if & only if $X$ is deterministic. $H(X) \le \log |J|$ if $x \in J$. Subadditivity: $H(X,Y) \le H(X) + H(Y)$. Concavity: if $p_X$ & $p_Y$ are 2 prob. distributions, then $H(\lambda p_X + (1-\lambda) p_Y) \ge \lambda H(p_X) + (1-\lambda) H(p_Y)$ for $0 \le \lambda \le 1$. $H(Y|X) \ge 0$, or equivalently $H(X,Y) \ge H(X)$. $I(X : Y) \ge 0$, with equality if & only if $X$ & $Y$ are independent.
20 So far: classical data compression -- the answer to Shannon's 1st question. (Q1) What is the limit to which information can be reliably compressed? (A1) Shannon's Source Coding Theorem: data compression limit = Shannon entropy of the source. Also covered: classical entropies and their properties.
21 Shannon's 2nd question: (Q2) What is the maximum amount of information that can be transmitted reliably per use of a communications channel? The biggest hurdle in the path of efficient transmission of info is the presence of noise in the communications channel: noise distorts the information sent through the channel. [Diagram: input -> noisy channel -> output, with output $\ne$ input] To combat the effects of noise: use error-correcting codes.
22 To overcome the effects of noise, instead of transmitting the original messages: the sender encodes her messages into suitable codewords; these codewords are then sent through (multiple uses of) the channel. [Diagram: Alice's message -> encoding $E_n$ -> codeword = input to $n$ uses of the channel $\mathcal{N}$, i.e. $\mathcal{N}^{(n)}$ -> output -> decoding $D_n$ -> Bob's inference] Error-correcting code: $\mathcal{C}_n := (E_n, D_n)$.
23 The idea behind the encoding: to introduce redundancy in the message so that, upon decoding, Bob can retrieve the original message with a low probability of error. The amount of redundancy which needs to be added depends on the noise in the channel.
24 Memoryless binary symmetric channel (m.b.s.c.): it transmits single bits; the effect of the noise is to flip the bit with probability $p$ (and to transmit it unchanged with probability $1-p$). [Channel diagram: $0 \to 0$ and $1 \to 1$ with prob. $1-p$; $0 \to 1$ and $1 \to 0$ with prob. $p$] Example: repetition code. Encoding: each bit is mapped to a 3-bit codeword ($0 \mapsto 000$, $1 \mapsto 111$); the 3 bits are sent through 3 successive uses of the m.b.s.c. Decoding (majority voting): Bob infers the bit value that occurs most often among the 3 bits he receives.
25 Probability of error for the m.b.s.c.: without encoding it is $p$; with encoding it is $q := \mathrm{Prob}(\text{2 or more bits flipped}) = 3 p^2 (1-p) + p^3$. [Table: possible outputs of 3 uses of the m.b.s.c. and the corresponding inference, e.g. received 010 -> infer 0] Prove: $q < p$ if $p < 1/2$ -- in this case encoding helps! (Encoding + decoding): repetition code.
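A sketch verifying the claim, both from the formula and by simulating the repetition code over a m.b.s.c. (the flip probability $p = 0.1$ is an arbitrary illustrative choice):

```python
import random

def q_analytic(p: float) -> float:
    """Prob(2 or more of 3 bits flipped) = 3 p^2 (1-p) + p^3."""
    return 3 * p**2 * (1 - p) + p**3

# q < p for every tested p in (0, 1/2): encoding helps.
for p in (0.05, 0.1, 0.25, 0.4):
    assert q_analytic(p) < p

# Monte Carlo check of the repetition code with majority voting.
random.seed(1)
p, trials, errors = 0.1, 100_000, 0
for _ in range(trials):
    received = [1 if random.random() < p else 0 for _ in range(3)]  # sent 000
    if sum(received) >= 2:       # majority voting decodes to 1: an error
        errors += 1
print(errors / trials, "vs analytic", q_analytic(p))   # both ~ 0.028
```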
26 Information transmission is said to be reliable if the probability of error in decoding the output vanishes asymptotically in the number of uses of the channel. Aim: to achieve reliable information transmission whilst optimizing the rate, i.e. the amount of information that can be sent per use of the channel. The optimal rate of reliable info transmission is the capacity of the channel.
27 Discrete classical channel $\mathcal{N}$: $J_X$ = input alphabet, $J_Y$ = output alphabet. For $n$ uses of $\mathcal{N}$: input $x^{(n)} \in J_X^n$, output $y^{(n)} \in J_Y^n$, related by the conditional probabilities $p(y^{(n)} \mid x^{(n)})$, which are known to sender & receiver.
28 The correspondence between input & output sequences is not 1-1: different inputs $x^{(n)}, x'^{(n)} \in J_X^n$ can produce the same output $y^{(n)} \in J_Y^n$. Shannon proved: it is possible to choose a subset of input sequences such that there exists only 1 highly likely input corresponding to a given output. Use these input sequences as codewords.
29 Transmission of info through a classical channel. Alice has a finite set of messages $M$. Her message $m \in M$ is mapped by the encoding $E_n$ to a codeword $x^{(n)} = (x_1, x_2, \ldots, x_n)$, the input to $n$ uses of the noisy channel $\mathcal{N}$ (i.e. $\mathcal{N}^{(n)}$). The output $y^{(n)} = (y_1, y_2, \ldots, y_n)$, governed by the conditional probabilities $p(y^{(n)} \mid x^{(n)})$, is mapped by the decoding $D_n$ to Bob's inference $m' \in M$. Error-correcting code: $\mathcal{C}_n := (E_n, D_n)$.
30 [Diagram: Alice's message $m \in M$ -> encoding $E_n$ -> input $x^{(n)}$ -> $\mathcal{N}^{(n)}$ -> output $y^{(n)}$ -> decoding $D_n$ -> Bob's inference $m' \in M$] If $m' \ne m$, then an error occurs! Info transmission is reliable if: Prob. of error $\to 0$ as $n \to \infty$. Rate of info transmission = number of bits of message transmitted per use of the channel. Aim: achieve reliable transmission whilst maximizing the rate. Shannon: there is a fundamental limit on the rate of reliable info transmission; it is a property of the channel. Capacity: maximum rate of reliable information transmission.
31 Shannon, in his Noisy Channel Coding Theorem, obtained an explicit expression for the capacity of a memoryless classical channel, for which $p(y^{(n)} \mid x^{(n)}) = \prod_{i=1}^{n} p(y_i \mid x_i)$. Memoryless (classical or quantum) channels: the action of each use of the channel is identical and independent for different uses -- i.e., the noise affecting states transmitted through the channel on successive uses is assumed to be uncorrelated.
32 Classical memoryless channel: a schematic representation. Input $X \sim p(x)$, $x \in J_X$; channel $\mathcal{N}$: a set of conditional probabilities $p(y \mid x)$; output $y \in J_Y$. Capacity: $C(\mathcal{N}) = \max_{p(x)} I(X : Y)$, the maximum over input distributions of the mutual information $I(X : Y) = H(X) + H(Y) - H(X,Y)$, with Shannon entropy $H(X) = -\sum_x p(x) \log p(x)$.
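As an illustration (not from the slides), the capacity of the m.b.s.c. can be found by brute-force maximization of $I(X:Y)$ over input distributions; the result matches the known closed form $C = 1 - h(p)$:

```python
import math

def h(t: float) -> float:
    """Binary entropy in bits."""
    return 0.0 if t in (0.0, 1.0) else -t*math.log2(t) - (1-t)*math.log2(1-t)

def I_bsc(a: float, p: float) -> float:
    """I(X:Y) = H(Y) - H(Y|X) for a BSC(p) with P(X=1) = a."""
    py1 = a * (1 - p) + (1 - a) * p      # P(Y = 1)
    return h(py1) - h(p)                 # H(Y|X) = h(p) for either input

p = 0.1
# Brute-force the maximization over input distributions on a grid.
C = max(I_bsc(a / 1000, p) for a in range(1001))
print(round(C, 6), "vs 1 - h(p) =", round(1 - h(p), 6))
# The maximum is attained at the uniform input (a = 1/2): C = 1 - h(p).
```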
33 Shannon's Noisy Channel Coding Theorem: for a memoryless channel $\mathcal{N}$ with input $X \sim p(x)$, conditional probabilities $p(y \mid x)$ and output $Y$, the optimal rate of reliable info transmission (the capacity) is $C(\mathcal{N}) = \max_{p(x)} I(X : Y)$. Sketch of proof.
34 Supposedly von Neumann asked Shannon to call this quantity entropy, saying: "You should call it entropy for two reasons; first, the function is already in use in thermodynamics; second, and most importantly, most people don't know what entropy really is, & if you use entropy in an argument, you will win every time."
35 Relation between Shannon entropy & thermodynamic entropy: $S_{\mathrm{th}} = k \log \Omega$, where $\Omega$ = total # of microstates. Suppose the $r$-th microstate occurs with prob. $p_r$, and consider $\nu$ replicas of the system. Then on average $\nu_r = \nu p_r$ replicas are in the $r$-th state. Total # of microstates of the compound system of $\nu$ replicas: $\Omega_\nu = \frac{\nu!}{\nu_1! \nu_2! \cdots \nu_r!}$. Thermodynamic entropy of the compound system: $S_\nu = k \log \Omega_\nu \approx -k \nu \sum_r p_r \log p_r$ (by Stirling's approximation). Per replica: $S = S_\nu / \nu = -k \sum_r p_r \log p_r \propto H(\{p_r\})$, the Shannon entropy.
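A numerical sketch of the Stirling step, with an illustrative distribution (`math.lgamma` gives $\ln \Gamma$, so $\ln n!$ = `lgamma(n + 1)`):

```python
import math

p = [0.5, 0.3, 0.2]
H = -sum(pr * math.log2(pr) for pr in p)   # Shannon entropy of {p_r}

for nu in (10, 100, 10_000):
    counts = [round(nu * pr) for pr in p]   # nu_r = nu * p_r replicas in state r
    # log2 of the multinomial coefficient Omega_nu = nu! / prod_r nu_r!
    log2_omega = (math.lgamma(sum(counts) + 1)
                  - sum(math.lgamma(c + 1) for c in counts)) / math.log(2)
    print(nu, round(log2_omega / nu, 4), "vs H =", round(H, 4))
# The ratio log2(Omega_nu) / nu converges to H as nu grows,
# which is exactly the Stirling step in the derivation above.
```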