3F1: Signals and Systems INFORMATION THEORY Examples Paper Solutions


Engineering Tripos Part IIA, Third Year

1. Let the joint probability mass function of two binary random variables X and Y be given in the following table:

(x, y)   P_XY(x, y)
(0, 0)   0.2
(0, 1)   0.3
(1, 0)   0.1
(1, 1)   0.4

Express the following quantities in terms of the binary entropy function h(x) = -x log2(x) - (1-x) log2(1-x).

(a) H(X)

We marginalise: P_X(0) = P_XY(0,0) + P_XY(0,1) = 1/2, and hence H(X) = h(1/2) = 1 [bit].

(b) H(Y)

Similarly, P_Y(0) = P_XY(0,0) + P_XY(1,0) = 0.3, and hence H(Y) = h(0.3).

(c) H(X|Y)

We compute

P_X|Y(0|0) = P_XY(0,0) / P_Y(0) = 0.2/0.3 = 2/3

and

P_X|Y(0|1) = P_XY(0,1) / P_Y(1) = 0.3/0.7 = 3/7

and hence, using h(2/3) = h(1/3),

H(X|Y) = P_Y(0) H(X|Y=0) + P_Y(1) H(X|Y=1) = 0.3 h(1/3) + 0.7 h(3/7).

(d) H(Y|X)

We compute

P_Y|X(0|0) = P_XY(0,0) / P_X(0) = 0.2/0.5 = 2/5

and

P_Y|X(0|1) = P_XY(1,0) / P_X(1) = 0.1/0.5 = 1/5

and hence

H(Y|X) = P_X(0) H(Y|X=0) + P_X(1) H(Y|X=1) = 0.5 h(2/5) + 0.5 h(1/5).

(e) H(XY)

Using the chain rule, we can compute either

H(XY) = H(X) + H(Y|X) = 1 + 0.5 h(2/5) + 0.5 h(1/5) = 1.8464 [bits]

or

H(XY) = H(Y) + H(X|Y) = h(0.3) + 0.3 h(1/3) + 0.7 h(3/7) = 1.8464 [bits]

which, as can be observed, yields the same result.

(f) I(X;Y)

We can write either

I(X;Y) = H(X) - H(X|Y) = 1 - 0.3 h(1/3) - 0.7 h(3/7) = 0.0349 [bits]

or

I(X;Y) = H(Y) - H(Y|X) = h(0.3) - 0.5 h(2/5) - 0.5 h(1/5) = 0.0349 [bits]

which, as can again be observed, yields the same result.

2. Let an N-ary random variable X be distributed as follows:

P_X(1) = p,
P_X(k) = (1-p)/(N-1) for k = 2, 3, ..., N.

Express the entropy of X in terms of the binary entropy function h(.).

H(X) = -p log2 p - (N-1) [(1-p)/(N-1)] log2[(1-p)/(N-1)]
     = -p log2 p - (1-p) log2(1-p) + (1-p) log2(N-1)
     = h(p) + (1-p) log2(N-1).
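As a quick numerical cross-check (a sketch added here, not part of the original paper), the quantities of question 1 can all be computed directly from the joint table; the helper name H and the use of NumPy are choices of this sketch:

```python
import numpy as np

# Joint pmf of question 1: rows indexed by x, columns by y.
P = np.array([[0.2, 0.3],
              [0.1, 0.4]])

def H(p):
    """Entropy in bits of a probability array (0 log 0 := 0)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

Px, Py = P.sum(axis=1), P.sum(axis=0)
HX, HY, HXY = H(Px), H(Py), H(P)
print(f"H(X)   = {HX:.4f} bits")             # 1.0000
print(f"H(Y)   = {HY:.4f} bits")             # h(0.3) = 0.8813
print(f"H(XY)  = {HXY:.4f} bits")            # 1.8464
print(f"H(X|Y) = {HXY - HY:.4f} bits")       # 0.9652
print(f"H(Y|X) = {HXY - HX:.4f} bits")       # 0.8464
print(f"I(X;Y) = {HX + HY - HXY:.4f} bits")  # 0.0349
```

The chain-rule identities H(XY) = H(X) + H(Y|X) = H(Y) + H(X|Y) fall out of the same numbers.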

3. A discrete memoryless source has an alphabet of eight letters x_i, i = 1, ..., 8, with probabilities 0.25, 0.20, 0.15, 0.12, 0.10, 0.08, 0.05 and 0.05.

(a) Use Huffman encoding to determine a binary code for the source output.

To produce the Huffman code, the two smallest probabilities are merged repeatedly. The order in which the probabilities are merged is:

(1) 0.05 + 0.05 = 0.10
(2) 0.08 + 0.10 = 0.18
(3) 0.10 + 0.12 = 0.22
(4) 0.15 + 0.18 = 0.33
(5) 0.20 + 0.22 = 0.42
(6) 0.25 + 0.33 = 0.58
(7) 0.42 + 0.58 = 1.00

Labelling the upper branch 0 and the lower branch 1 gives, for example, the code:

Symbol   Code
x1       00
x2       10
x3       010
x4       110
x5       111
x6       0110
x7       01110
x8       01111

Note that other labellings are possible, and also there is a choice at the second merge since there are two probabilities of 0.10 at that time.

(b) Determine the average codeword length L.

Average codeword length:

L = 2(0.25) + 2(0.20) + 3(0.15) + 3(0.12) + 3(0.10) + 4(0.08) + 5(0.05) + 5(0.05) = 2.83 bits.
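The merge order can also be reproduced mechanically. Below is a minimal Huffman-length sketch (an illustration added here, not from the paper; huffman_lengths is a name chosen for this sketch). Ties may be broken differently from the hand working, so the individual lengths can differ, but the average length is 2.83 bits in every case:

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Return Huffman codeword lengths for the given symbol probabilities."""
    tiebreak = count()  # avoids comparing lists when probabilities are equal
    # Each heap entry: (probability, tiebreaker, list of symbol indices in subtree).
    heap = [(p, next(tiebreak), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:          # every symbol under the merge gains one bit
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(tiebreak), s1 + s2))
    return lengths

probs = [0.25, 0.20, 0.15, 0.12, 0.10, 0.08, 0.05, 0.05]
lengths = huffman_lengths(probs)
L = sum(p * l for p, l in zip(probs, lengths))
print(lengths)              # codeword lengths (tie-break dependent)
print(f"L = {L:.2f} bits")  # 2.83, matching the hand working
```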

(c) Determine the entropy of the source and hence its efficiency.

H(S) = -(0.25 log2 0.25 + 0.20 log2 0.20 + 0.15 log2 0.15 + 0.12 log2 0.12 + 0.10 log2 0.10 + 0.08 log2 0.08 + 2 × 0.05 log2 0.05) = 2.798 bits.

The average codeword length is greater than the entropy, as expected.

Efficiency = 2.798/2.83 = 98.9%.

4. Show that for statistically independent events

H(X_1, X_2, ..., X_n) = Σ_{i=1}^{n} H(X_i).

First expand out H(X_1, X_2, ..., X_n):

H(X_1, X_2, ..., X_n) = - Σ_{i_1=1}^{N_1} Σ_{i_2=1}^{N_2} ... Σ_{i_n=1}^{N_n} P(x_{i_1}, x_{i_2}, ..., x_{i_n}) log2 P(x_{i_1}, x_{i_2}, ..., x_{i_n}).

The probabilities are independent, so this is

= - Σ_{i_1} Σ_{i_2} ... Σ_{i_n} [Π_{j=1}^{n} P(x_{i_j})] log2 Π_{j=1}^{n} P(x_{i_j}).

The last summation only indexes i_n, so we can move all the other terms in the product to the left of it. We can also replace the log of a product with the sum of the logs:

= - Σ_{i_1} ... Σ_{i_{n-1}} [Π_{j=1}^{n-1} P(x_{i_j})] Σ_{i_n=1}^{N_n} P(x_{i_n}) [log2 P(x_{i_n}) + Σ_{j=1}^{n-1} log2 P(x_{i_j})].

All terms in the right-hand sum except the last one do not depend on i_n, and since Σ_{i_n=1}^{N_n} P(x_{i_n}) = 1 we can transform

Σ_{i_n} P(x_{i_n}) [log2 P(x_{i_n}) + Σ_{j=1}^{n-1} log2 P(x_{i_j})] = Σ_{j=1}^{n-1} log2 P(x_{i_j}) + Σ_{i_n} P(x_{i_n}) log2 P(x_{i_n}).

Now the right-hand term is independent of i_1, ..., i_{n-1}, and Σ_{i_1} ... Σ_{i_{n-1}} Π_{j=1}^{n-1} P(x_{i_j}) = 1, so we have

H(X_1, X_2, ..., X_n) = - Σ_{i_n} P(x_{i_n}) log2 P(x_{i_n}) - Σ_{i_1} ... Σ_{i_{n-1}} [Π_{j=1}^{n-1} P(x_{i_j})] log2 Π_{j=1}^{n-1} P(x_{i_j})
= H(X_n) + H(X_1, X_2, ..., X_{n-1});

by induction this gives

H(X_1, X_2, ..., X_n) = Σ_{i=1}^{n} H(X_i).
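A small numerical sanity check of the question 4 identity (a sketch added here, assuming NumPy; the three pmfs are arbitrary choices): build the joint pmf of independent variables as an outer product of the marginals and compare entropies.

```python
import numpy as np

def H(p):
    """Entropy in bits of a probability array (0 log 0 := 0)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Three independent discrete variables with arbitrary pmfs.
pmfs = [np.array([0.5, 0.5]),
        np.array([0.2, 0.3, 0.5]),
        np.array([0.1, 0.9])]

# Joint pmf of independent variables = outer product of the marginals.
joint = pmfs[0]
for p in pmfs[1:]:
    joint = np.multiply.outer(joint, p)

print(H(joint.ravel()))          # joint entropy
print(sum(H(p) for p in pmfs))   # sum of marginal entropies -- identical
```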

5 so we have HX, X,, X n = N n i n= P x in log P x in + N N i = i = = HX n + HX, X,, X n by induction this gives N n i n = n P x ij n log P x ij = n HX i i= 5. A five-level non-uniform quantizer for a zero-mean signal results in the 5 levels b, a,, a, b with corresponding probabilities of occurrence p b = p b =.5, p a = p a =. and p =.7. a Design a Huffman code that encodes one signal sample at a time and determine the average bit rate per sample \ \ -a. --\ \ / a. --/ \ / / -b.5 --\ / -. --/ b.5 --/ Symbol Code giving code: a a b b Average bit rate = =.6 bits. Another version of the code would give bits to a or to a and 4 bits each to b and b. It has the same mean length. b Design a Huffman code that encodes two output samples at a time and determine the average bit rate per sample. We combine the 5 states into 5 pairs of states, in descending order of probability and form a Huffman code tree as follows. 5

(b) Design a Huffman code that encodes two output samples at a time and determine the average bit rate per sample.

We combine the 5 states into 25 pairs of states, in descending order of probability, and form a Huffman code tree. Note that there are many different possible versions of this tree, as many of the nodes have equal probabilities and so can be taken in arbitrary order. However, the mean length, and hence the performance of the code, should be the same in all cases. The codeword lengths are found by counting the number of branch points passed going from the root to each code state; one resulting set of lengths is:

Length   Pair(s)                                  Probability each
1        (0,0)                                    0.49
4        (0,-a), (0,a), (-a,0), (a,0)             0.07
5        (0,-b), (0,b), (-b,0), (b,0)             0.035
6        (-a,-a), (a,a)                           0.01
7        (a,-a), (-a,a)                           0.01
7        the eight pairs mixing ±a with ±b        0.005
8        (-b,-b), (b,b), (-b,b), (b,-b)           0.0025

Summing probabilities for codewords of the same length:

Av. codeword length = 1(0.49) + 4(4 × 0.07) + 5(4 × 0.035) + 6(2 × 0.01) + 7(2 × 0.01) + 7(8 × 0.005) + 8(4 × 0.0025) = 2.93 bits per pair of samples.

Hence code rate = 2.93/2 = 1.465 bits/sample.

(c) What are the efficiencies of these two codes?

Entropy of the message source = -(0.7 log2 0.7 + 2 × 0.1 log2 0.1 + 2 × 0.05 log2 0.05) = 1.4568 bits/sample.

Hence:

Efficiency of code 1 = 1.4568/1.6 = 91.05%
Efficiency of code 2 = 1.4568/1.465 = 99.44%

Code 2 represents a significant improvement, because it eliminates the zero state of code 1, which has a probability well above 0.5.
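The pair-coding result can also be checked mechanically. The sketch below (added here, reusing the huffman_lengths helper from the sketch after question 3) builds all 25 pair probabilities and confirms the 2.93 bits/pair average and the two efficiencies:

```python
import numpy as np
# Assumes huffman_lengths() from the sketch after question 3 is defined.

p1 = {'-b': 0.05, '-a': 0.1, '0': 0.7, 'a': 0.1, 'b': 0.05}

# All 25 ordered pairs of quantizer levels and their probabilities.
pairs = {(s, t): p1[s] * p1[t] for s in p1 for t in p1}
probs = list(pairs.values())

lengths = huffman_lengths(probs)
L_pair = sum(p * l for p, l in zip(probs, lengths))
H1 = -sum(p * np.log2(p) for p in p1.values())      # source entropy per sample

print(f"av. length  = {L_pair:.2f} bits/pair")      # 2.93
print(f"rate        = {L_pair/2:.3f} bits/sample")  # 1.465
print(f"efficiency1 = {H1/1.6:.1%}")                # ~91.0%
print(f"efficiency2 = {H1/(L_pair/2):.1%}")         # ~99.4%
```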

6. While we cover in 3F1 and 4F5 the application of Shannon's theory to data compression and transmission, Shannon also applied the concepts of entropy and mutual information to the study of secrecy systems. The figure below shows a cryptographic scenario where Alice wants to transmit a secret plaintext message X to Bob and they share a secret key Z, while the enemy Eve has access to the public message Y.

[Block diagram: a message source produces X; Alice's encrypter combines X with the key Z from the key source to produce the public message Y; Bob's decrypter recovers X from Y and Z for the message destination; the enemy cryptanalyst Eve observes Y.]

(a) Write out two conditions using conditional entropies involving X, Y and Z to enforce the deterministic encryptability and decryptability of the messages. Hint: the entropy of a function given its argument is zero, e.g., for any random variable X, H(f(X)|X) = 0.

The conditions are H(Y|XZ) = 0, because the ciphertext is a function of the secret plaintext message and the secret key, and H(X|YZ) = 0, because the secret plaintext message can be inferred from the ciphertext and the key.

(b) Shannon made the notion of an unbreakable cryptosystem precise by saying that a cryptosystem provides perfect secrecy if the enemy's observation is statistically independent of the plaintext, i.e., I(X;Y) = 0. Show that this implies Shannon's much cited bound on key size, H(Z) ≥ H(X), i.e., perfect secrecy can only be attained if the entropy of the key (and hence its compressed length) is at least as large as the entropy of the secret plaintext.

Since I(X;Y) = 0, we have

H(X) = H(X|Y) = H(X,Z|Y) - H(Z|X,Y) ≤ H(X,Z|Y)

and

H(X,Z|Y) = H(Z|Y) + H(X|Z,Y) = H(Z|Y) ≤ H(Z),

where H(X|Z,Y) = 0 by the decryptability condition of part (a). Hence H(Z) ≥ H(X).

(c) Vernam's cipher assumes a binary secret plaintext message X with any probability distribution P_X(0) = p = 1 - P_X(1), and a binary secret key Z that is uniform (P_Z(0) = P_Z(1) = 1/2) and independent of X. The encrypter simply adds the secret key to the plaintext modulo 2, and the decrypter, by adding the same key to the ciphertext, can recover the plaintext. Show that Vernam's cipher achieves perfect secrecy, i.e., I(X;Y) = 0.

We have

P_Y(0) = Σ_{x,z} P_XYZ(x, 0, z) = Σ_{x,z} P_Y|XZ(0|x,z) P_X(x) P_Z(z)

and note that P_Y|XZ(0|x,z) = 1 if (x + z) mod 2 = 0, and P_Y|XZ(0|x,z) = 0 otherwise, and so the expression above becomes

P_Y(0) = p (1/2) + (1-p) (1/2) = 1/2

and hence H(Y) = 1 [bit]. Furthermore,

P_Y|X(0|0) = P_XY(0,0) / P_X(0)
           = [P_XYZ(0,0,0) + P_XYZ(0,0,1)] / P_X(0)
           = [P_X(0) P_Z(0) P_Y|XZ(0|0,0) + P_X(0) P_Z(1) P_Y|XZ(0|0,1)] / P_X(0)
           = 1/2 + 0 = 1/2

and similarly

P_Y|X(0|1) = P_XY(1,0) / P_X(1)
           = [P_XYZ(1,0,0) + P_XYZ(1,0,1)] / P_X(1)
           = [P_X(1) P_Z(0) P_Y|XZ(0|1,0) + P_X(1) P_Z(1) P_Y|XZ(0|1,1)] / P_X(1)
           = 0 + 1/2 = 1/2

and hence

H(Y|X) = P_X(0) H(Y|X=0) + P_X(1) H(Y|X=1) = p h(1/2) + (1-p) h(1/2) = 1 [bit]

and so, finally,

I(X;Y) = H(Y) - H(Y|X) = 1 - 1 = 0 [bits].
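The perfect secrecy of Vernam's cipher is easy to verify numerically. A minimal sketch (added here, assuming NumPy; the bias p = 0.3 is an arbitrary choice) builds the exact joint distribution of (X, Y) and evaluates I(X;Y):

```python
import numpy as np

p = 0.3  # any plaintext bias P_X(0) = p works

def H(probs):
    """Entropy in bits of a probability array (0 log 0 := 0)."""
    probs = probs[probs > 0]
    return -np.sum(probs * np.log2(probs))

# Exact joint distribution of (X, Y) under Y = X xor Z, with Z uniform.
P_XY = np.zeros((2, 2))
for x, px in enumerate([p, 1 - p]):
    for z in (0, 1):
        P_XY[x, x ^ z] += px * 0.5

HX, HY, HXY = H(P_XY.sum(1)), H(P_XY.sum(0)), H(P_XY)
print(f"I(X;Y) = {HX + HY - HXY:.6f} bits")  # 0.000000 -- perfect secrecy
```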

7. What is the entropy of the following continuous probability density functions?

(a) P(x) = 0 for x < 0; P(x) = 0.5 for 0 < x < 2; P(x) = 0 for x > 2.

H(X) = - ∫_0^2 0.5 log2 0.5 dx = -log2 0.5 = 1 bit.

(b) P(x) = (λ/2) e^{-λ|x|}

Working in natural logarithms:

H(X) = - ∫ (λ/2) e^{-λ|x|} ln[(λ/2) e^{-λ|x|}] dx
     = -ln(λ/2) ∫ (λ/2) e^{-λ|x|} dx + λ ∫ |x| (λ/2) e^{-λ|x|} dx
     = ln(2/λ) + λ ∫_0^∞ λ x e^{-λx} dx.

Integrating by parts with u = x and v' = λ e^{-λx}, so v = -e^{-λx}:

λ ∫_0^∞ λ x e^{-λx} dx = λ ([-x e^{-λx}]_0^∞ + ∫_0^∞ e^{-λx} dx) = λ (1/λ) = 1.

Hence

H(X) = ln(2/λ) + 1 = ln(2e/λ) nats = log2(2e/λ) bits.

(c) P(x) = (1/(σ√(2π))) e^{-x²/(2σ²)}

H(X) = - ∫ (1/(σ√(2π))) e^{-x²/(2σ²)} [-ln(σ√(2π)) - x²/(2σ²)] dx
     = ln(σ√(2π)) + (1/(σ√(2π))) ∫ (x²/(2σ²)) e^{-x²/(2σ²)} dx.

Integrating by parts with u = x and v' = (x/σ²) e^{-x²/(2σ²)}, giving v = -e^{-x²/(2σ²)}:

(1/(2σ√(2π))) ∫ (x²/σ²) e^{-x²/(2σ²)} dx = (1/(2σ√(2π))) ([-x e^{-x²/(2σ²)}]_{-∞}^{∞} + ∫ e^{-x²/(2σ²)} dx) = (1/(2σ√(2π))) σ√(2π) = 1/2.

Hence

H(X) = ln(σ√(2π)) + 1/2 nats = log2(σ√(2π)) + (1/2) log2 e = log2(σ√(2πe)) bits.
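These three closed forms can be sanity-checked by numerical quadrature. A sketch added here, assuming SciPy is available (the parameter values λ = 1.5 and σ = 2 are arbitrary choices):

```python
import numpy as np
from scipy.integrate import quad

def h_bits(pdf, lo, hi):
    """Differential entropy -integral of p log2 p, by numerical quadrature."""
    integrand = lambda x: -pdf(x) * np.log2(pdf(x)) if pdf(x) > 0 else 0.0
    return quad(integrand, lo, hi)[0]

lam, sigma = 1.5, 2.0

print(h_bits(lambda x: 0.5, 0, 2))                        # 1.0 bit (uniform)
print(h_bits(lambda x: lam/2 * np.exp(-lam*abs(x)), -50, 50),
      np.log2(2*np.e/lam))                                # Laplacian: log2(2e/lam)
print(h_bits(lambda x: np.exp(-x**2/(2*sigma**2))/(sigma*np.sqrt(2*np.pi)),
             -50, 50),
      0.5*np.log2(2*np.pi*np.e*sigma**2))                 # Gaussian: log2(sigma*sqrt(2*pi*e))
```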

8. Continuous variables X and Y are normally distributed with standard deviation σ = 1:

P(x) = (1/√(2π)) e^{-x²/2},   P(y) = (1/√(2π)) e^{-y²/2}.

A variable Z is defined by Z = X + Y. What is the mutual information of X and Z?

I(Z;X) = H(Z) - H(Z|X).

Z is a normally distributed variable with standard deviation σ = √2, since it is the sum of two independent variables each with standard deviation σ = 1. Hence, by 7(c) above,

H(Z) = log2(√2 √(2π)) + (1/2) log2 e.

If X is known then Z is still normally distributed, but now the mean is x and the standard deviation is 1, since Z = x + Y and Y has zero mean and standard deviation σ = 1. So:

H(Z|X) = log2 √(2π) + (1/2) log2 e.

So

I(Z;X) = log2(√2 √(2π)) - log2 √(2π) = log2 √2 = (1/2) log2 2 = 0.5 bit.
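A quick Monte Carlo cross-check (a sketch added here): for jointly Gaussian variables, I(X;Z) = -(1/2) log2(1 - ρ²), where ρ is the correlation coefficient of X and Z; this is a standard Gaussian identity, not derived in the paper. The sample estimate of ρ reproduces the 0.5-bit answer:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
x = rng.normal(size=n)
y = rng.normal(size=n)
z = x + y

# For jointly Gaussian variables, I(X;Z) = -0.5 log2(1 - rho^2).
rho = np.corrcoef(x, z)[0, 1]       # true value is 1/sqrt(2)
print(-0.5 * np.log2(1 - rho**2))   # ~0.5 bit from the samples
print(0.5 * np.log2(2))             # exact answer from the solution
```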

9. A symmetric binary communications channel operates with signalling levels of ±1 volt at the detector in the receiver, and the rms noise level at the detector is 0.25 volts. The binary symbol rate is 100 kbit/s.

(a) Determine the probability of error on this channel and hence, based on mutual information, calculate the theoretical capacity of this channel for error-free communication.

Using notation and results from the 3F1 Random Processes course and from section 3.5 of the 3F1 Information Theory course:

Probability of bit error: p_e = Q(v/σ) = Q(1/0.25) = Q(4) ≈ 3.17 × 10⁻⁵.

The mutual information between the channel input X and the channel output Y is then:

I(Y;X) = H(Y) - H(Y|X) = H(Y) - H[p_e, 1-p_e].

Now, assuming equiprobable random bits at the input and output of the channel:

H(Y) = -[0.5 log2 0.5 + 0.5 log2 0.5] = 1 bit/sym

and, from the above p_e:

H[p_e, 1-p_e] = -[p_e log2 p_e + (1-p_e) log2(1-p_e)] ≈ 0.0005 bit/sym.

So I(Y;X) ≈ 1 - 0.0005 = 0.9995 bit/sym.

Hence the error-free capacity of the binary channel is

Symbol rate × I(Y;X) ≈ 100k × 0.9995 ≈ 100 kbit/s.

(b) If the binary signalling were replaced by symbols drawn from a continuous process with a Gaussian (normal) pdf with zero mean and the same mean power at the detector, determine the theoretical capacity of this new channel, assuming the symbol rate remains at 100 ksym/s and the noise level is unchanged.

For symbols with a Gaussian pdf, section 4.4 of the course shows that the mutual information of the channel is now given by:

I(Y;X) = h(Y) - h(N) = log2(σ_Y/σ_N) = (1/2) log2(1 + σ_X²/σ_N²)

since the noise is uncorrelated with the signal, so σ_Y² = σ_X² + σ_N². For the given channel, σ_X/σ_N = 1/0.25 = 4. Hence

I(Y;X) = (1/2) log2(1 + 4²) = 2.0437 bit/sym

and so the theoretical capacity of the Gaussian channel is

C_G = 100k × 2.0437 = 204.37 kbit/s,

over twice that of the binary channel.

(c) The three capacities are plotted below. There is no loss at low SNR from using binary signalling as long as the output remains continuous. As the SNR increases, all binary signalling methods hit a 1 bit/use ceiling, whereas the capacity for continuous signalling continues to grow unbounded.
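The numbers in (a) and (b) can be reproduced in a few lines (a sketch added here, assuming SciPy for the Q-function via norm.sf):

```python
import numpy as np
from scipy.stats import norm

def h2(p):
    """Binary entropy function in bits."""
    return -p*np.log2(p) - (1-p)*np.log2(1-p)

rate = 100e3           # symbols per second
v, sigma = 1.0, 0.25   # signalling level and rms noise (volts)

pe = norm.sf(v / sigma)                          # Q(4)
C_binary = rate * (1 - h2(pe))                   # binary-in, binary-out channel
C_gauss = rate * 0.5 * np.log2(1 + (v/sigma)**2)

print(f"p_e      = {pe:.3g}")                   # ~3.17e-05
print(f"C_binary = {C_binary/1e3:.2f} kbit/s")  # ~99.95
print(f"C_gauss  = {C_gauss/1e3:.2f} kbit/s")   # ~204.37
```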

[Figure: channel capacity in bits/use versus SNR in dB for three cases: continuous input with continuous output, binary input with binary output, and binary input with continuous output.]
