MAHALAKSHMI ENGINEERING COLLEGE-TRICHY QUESTION BANK UNIT V PART-A


SATELLITE COMMUNICATION DEPT./SEM.: ECE/VIII UNIT V PART-A

1. What is a binary symmetric channel? (AUC DEC 2006)
A binary symmetric channel (BSC) is a discrete memoryless channel with binary input and binary output in which a transmitted 0 is received as a 1 with the same crossover probability p with which a transmitted 1 is received as a 0; this single probability p completely characterizes the channel.

2. Define information rate. (AUC DEC 2007)
Information rate R is the average number of bits of information per second:
R = rH information bits/second
where H is the entropy and r is the rate at which messages are generated.

3. Define entropy. (AUC MAY 2007) (AUC DEC 2007) (AUC MAY 2011)

The entropy, H, of a discrete random variable X is a measure of the amount of uncertainty associated with the value of X. Suppose one transmits 1000 bits (0s and 1s). If these bits are known ahead of transmission (to be a certain value with absolute probability), logic dictates that no information has been transmitted. If, however, each bit is equally and independently likely to be 0 or 1, then 1000 bits (in the information-theoretic sense) have been transmitted. Between these two extremes, information can be quantified as follows. If {x1, ..., xn} is the set of all messages that X could be, and p(x) is the probability of X taking the value x, then the entropy of X is defined as
H(X) = E[I(x)] = −Σ_x p(x) log2 p(x)
(Here, I(x) = log2(1/p(x)) is the self-information, which is the entropy contribution of an individual message, and E is the expected value.) An important property of entropy is that it is maximized when all the messages in the message space are equiprobable, p(x) = 1/n, i.e. most unpredictable, in which case H(X) = log2 n. The special case of information entropy for a random variable with two outcomes is the binary entropy function, usually taken to logarithmic base 2:
H_b(p) = −p log2 p − (1 − p) log2(1 − p)

4. Define mutual information. (AUC MAY 2008)
Mutual information is the amount of information transferred when x_j is transmitted and y_k is received. It is represented by I(x_j, y_k) and given as
I(x_j, y_k) = log2 [p(x_j / y_k) / p(x_j)]
where p(x_j / y_k) is the conditional probability that x_j was transmitted given that y_k is received, and p(x_j) is the probability of symbol x_j for transmission.

5. Find the entropy.
Symbol: S0 S1 S2 S3 S4
Probability: (AUC DEC 2008)

6. State Shannon's first theorem.
(AUC DEC 2008) Source Coding Theorem (Shannon's first theorem): Given a discrete memoryless source of entropy H(S), the average code-word length L for any distortionless source coding is bounded as L ≥ H(S). This theorem provides the mathematical tool for assessing data compaction, i.e. lossless data compression, of data generated by a discrete memoryless source. The entropy of a source is a function of the probabilities of the source symbols that constitute the alphabet of the source. Entropy of a Discrete Memoryless Source: assume that the source output is modeled as a discrete random variable, S, which takes on symbols from a fixed finite alphabet.

The entropy is a measure of the average information content per source symbol. The source coding theorem is also known as the "noiseless coding theorem" in the sense that it establishes the condition for error-free encoding to be possible.

7. State the channel capacity theorem. (AUC MAY 2008)
Communications over a channel such as an Ethernet cable are the primary motivation of information theory. As anyone who has ever used a telephone (mobile or landline) knows, however, such channels often fail to produce exact reconstruction of a signal; noise, periods of silence, and other forms of signal corruption often degrade quality. How much information can one hope to communicate over a noisy (or otherwise imperfect) channel? Consider the communications process over a discrete channel. Here X represents the space of messages transmitted, and Y the space of messages received during a unit time over our channel. Let p(y|x) be the conditional probability distribution function of Y given X. We will consider p(y|x) to be an inherent fixed property of our communications channel (representing the nature of the noise of our channel). Then the joint distribution of X and Y is completely determined by our channel and by our choice of f(x), the marginal distribution of messages we choose to send over the channel. Under these constraints, we would like to maximize the rate of information, or the signal, we can communicate over the channel. The appropriate measure for this is the mutual information, and this maximum mutual information is called the channel capacity, given by
C = max_{f(x)} I(X; Y)

(where C is measured in bits per symbol). For any information rate R < C and coding error ε > 0, for large enough N, there exists a code of length N and rate ≥ R and a decoding algorithm such that the maximal probability of block error is ≤ ε; that is, it is always possible to transmit with arbitrarily small block error. In addition, for any rate R > C, it is impossible to transmit with arbitrarily small block error. Channel coding is concerned with finding such nearly optimal codes that can be used to transmit data over a noisy channel with a small coding error at a rate near the channel capacity.

8. State the channel coding theorem. (AUC MAY 2007)

9. What is the channel capacity of a binary symmetric channel with error probability of 0.2? (AUC DEC 2007)

10. Calculate the entropy of a source with a symbol set containing 64 symbols, each with probability p_i = 1/64. (AUC APR 2008)

11. Compare Shannon and Huffman coding. (AUC MAY 2009)
Shannon's theorem gives an upper bound to the capacity of a link, in bits per second (bps), as a function of the available bandwidth and the signal-to-noise ratio of the link. The theorem can be stated as
C = B log2(1 + S/N)
where C is the achievable channel capacity, B is the bandwidth of the line, S is the average signal power and N is the average noise power. The signal-to-noise ratio (S/N) is usually expressed in decibels (dB), given by the formula 10 log10(S/N). For example, a signal-to-noise ratio of 1000 is commonly expressed as 10 log10(1000) = 30 dB.

12. Determine differential entropy. (AUC MAY 2010)
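The short-answer questions on channel capacity can be checked with a small Python sketch. It computes the BSC capacity C = 1 − H(p) for p = 0.2, the entropy of 64 equiprobable symbols, and a Shannon-Hartley capacity; the bandwidth of 3100 Hz in the last line is only an illustrative value, not part of the questions.

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1-p) log2(1-p), entropy of a binary source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Question 9: capacity of a binary symmetric channel, C = 1 - H(p), p = 0.2
c_bsc = 1 - binary_entropy(0.2)
print(round(c_bsc, 4))   # ~0.2781 bits per channel use

# Question 10: entropy of 64 equiprobable symbols, H = log2 64
h64 = math.log2(64)
print(h64)               # 6.0 bits/symbol

# Shannon-Hartley: C = B log2(1 + S/N); assumed B = 3100 Hz, S/N = 1000 (30 dB)
c_link = 3100 * math.log2(1 + 1000)
print(round(c_link))     # ~30898 bps
```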

14. Define rate bandwidth and bandwidth efficiency. (AUC DEC 2010)
Bandwidth efficiency is the ratio of the data rate in bits per second to the utilized channel bandwidth:
ρ = Rb / B
where Rb is the data rate and B is the channel bandwidth.

15. A source generates 3 messages with probability 0.5, 0.25, Calculate source entropy. (AUC DEC 2010) (AUC MAY 2010)

16. Differentiate between lossless and lossy coding. (AUC MAY 2011)
Lossless coding: Lossless compression schemes are reversible, so that the original data can be reconstructed exactly. Lossless data compression will always fail to compress some files.
Lossy coding: Lossy compression accepts some loss of data in order to achieve higher compression. A lossy method can produce a much smaller file than lossless methods.

17. State Shannon's channel capacity theorem for a power- and band-limited channel. (AUC DEC 2011)
Refer question No.

18. What is information theory?
Information theory deals with the mathematical modelling and analysis of a communication system rather than with physical sources and physical channels.

19. What is a discrete memoryless source?
A source whose symbols emitted during successive signalling intervals are statistically independent is called a discrete memoryless source. Here "memoryless" means that the symbol emitted at any time is independent of previous choices.

20. What is amount of information?

The amount of information gained after observing the event S = s_k, which occurs with probability p_k, is given by the logarithmic function
I_k = log2(1 / p_k)
The unit of information is the bit.

21. What is information rate?
Information rate R is the average number of bits of information per second:
R = rH information bits/second
where H is the entropy and r is the rate at which messages are generated.

22. What is meant by source encoding?
The efficient representation of data generated by a discrete source is called source coding. The device that performs the representation is called a source encoder.

23. Name two source coding techniques.
Shannon-Fano coding and Huffman coding.

24. Write the expression for code efficiency.
η = H / N̄
where H is the entropy and N̄ is the average number of bits per message.

25. What is channel redundancy?
Redundancy is given as
Redundancy (γ) = 1 − code efficiency = 1 − η
The redundancy should be as low as possible.

26. Write about data compaction.
For efficient signal transmission, the redundant information should be removed from the signal prior to transmission. This operation, performed with no loss of information, is ordinarily carried out on a signal in digital form and is referred to as data compaction (or lossless data compression).

27. Write about channel capacity.
The channel capacity of a discrete memoryless channel is the maximum average mutual information, where the maximization is taken with respect to the input probabilities P(x_j):
C = max_{P(x_j)} I(X; Y)

28. Define channel efficiency and give its mathematical expression.
The transmission efficiency (channel efficiency) is defined as the ratio of actual transinformation to maximum transinformation:
η = I(X; Y) / max I(X; Y) = I(X; Y) / C
where C is the channel capacity.

29. Define redundancy of the channel and give its mathematical expression.
The redundancy of the channel is defined as the ratio of the difference between the maximum and actual transinformation to the maximum transinformation.
It is denoted as γ:
γ = 1 − η = [C − I(X; Y)] / C
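The code-efficiency and redundancy definitions above can be sketched numerically. A minimal Python example, using an assumed four-symbol source chosen only for illustration; it compares a fixed-length code against a matched variable-length code.

```python
import math

probs = [0.5, 0.25, 0.125, 0.125]          # assumed example source
H = -sum(p * math.log2(p) for p in probs)  # source entropy, bits/symbol

# Fixed-length code: every symbol gets 2 bits
N_fixed = sum(p * 2 for p in probs)
eta_fixed = H / N_fixed                    # code efficiency eta = H / N-bar

# Variable-length code with lengths 1, 2, 3, 3 (e.g. 0, 10, 110, 111)
lengths = [1, 2, 3, 3]
N_var = sum(p * l for p, l in zip(probs, lengths))
eta_var = H / N_var

print(H)                          # 1.75 bits/symbol
print(eta_fixed, 1 - eta_fixed)   # efficiency 0.875, redundancy 0.125
print(eta_var, 1 - eta_var)       # efficiency 1.0, redundancy 0.0
```

Matching the code-word lengths to the symbol probabilities drives the redundancy γ = 1 − η to zero here because the probabilities are exact powers of 1/2.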

30. What is a discrete memoryless channel?
A discrete memoryless channel is a statistical model with an input X and an output Y that is a noisy version of X; both X and Y are random variables. The channel is said to be discrete when both X and Y are discrete, and memoryless when the current output symbol depends only on the current input symbol and not on any of the previous ones.

31. What is mutual information?
Mutual information is the amount of information transferred when x_j is transmitted and y_k is received. It is represented by I(x_j, y_k) and given as
I(x_j, y_k) = log2 [p(x_j / y_k) / p(x_j)]
where p(x_j / y_k) is the conditional probability that x_j was transmitted given that y_k is received, and p(x_j) is the probability of symbol x_j for transmission.

32. Explain Shannon-Fano coding.
An efficient code can be obtained by the following simple procedure, known as the Shannon-Fano algorithm:
1. List the source symbols in order of decreasing probability.
2. Partition the set into two sets that are as close to equiprobable as possible, and assign 0 to the upper set and 1 to the lower set.
3. Continue this process, each time partitioning the sets with as nearly equal probabilities as possible, until further partitioning is not possible.

33. State the properties of mutual information.
I(X; Y) = I(Y; X)
I(X; Y) ≥ 0
I(X; Y) = H(Y) − H(Y/X)
I(X; Y) = H(X) + H(Y) − H(X, Y)

34. Give the relation between the different entropies.
H(X, Y) = H(X) + H(Y/X) = H(Y) + H(X/Y)
where H(X) is the entropy of the source, H(Y/X) and H(X/Y) are conditional entropies, H(Y) is the entropy of the destination, and H(X, Y) is the joint entropy of the source and destination.

35. What is a channel diagram and channel matrix?
The transition probability diagram of the channel is called the channel diagram, and its matrix representation is called the channel matrix.

36. What is uncertainty? Explain the difference between uncertainty and information.
The words uncertainty, surprise and information are all related to each other.
Before an event occurs there is uncertainty, when the event occurs there is an amount of surprise, and after the occurrence of the event there is a gain of information. Consider a source which emits discrete symbols randomly from a fixed alphabet, i.e. X = {x_0, x_1, x_2, ..., x_{K-1}}. The various symbols in X have probabilities p_0, p_1, p_2, etc., which can be written as
P(X = x_k) = p_k,  k = 0, 1, ..., K − 1
The set of probabilities satisfies the condition
Σ_{k=0}^{K−1} p_k = 1

The idea of information is related to uncertainty or surprise. Consider the emission of symbol X = x_k from the source. If the probability of x_k is p_k = 0, then such a symbol is impossible; similarly, when p_k = 1, the symbol is certain. In both cases there is no surprise and hence no information is produced when symbol x_k is emitted. The lower the probability p_k, the greater the surprise or uncertainty. Before the event X = x_k occurs, there is an amount of uncertainty. When the symbol X = x_k occurs, there is an amount of surprise. After the occurrence of the symbol X = x_k there is a gain in the amount of information, the essence of which may be viewed as the resolution of uncertainty.

37. Explain Huffman coding.
Huffman coding results in an optimum code; it is the code with the highest efficiency. The procedure is as follows:
a. List the source symbols in order of decreasing probability.
b. Combine the probabilities of the two symbols having the lowest probabilities, and reorder the resulting probabilities. This step is called reduction. The same procedure is repeated until two ordered probabilities remain.
c. Start encoding with the last reduction, which consists of exactly two ordered probabilities. Assign 0 as the first digit in the code words for all the source symbols associated with the first probability, and assign 1 to the second probability.
d. Now go back and assign 0 and 1 to the second digit for the two probabilities that were combined in the previous reduction step, retaining all assignments made in step c.
e. Keep regressing this way until the first column is reached.

38. State Shannon's theorem.
Given a source of M equally likely messages, with M >> 1, which is generating information at a rate R, and a channel with channel capacity C.
Then if R < C, there exists a coding technique such that the output of the source may be transmitted over the channel with a probability of error in receiving the message that may be made arbitrarily small.

39. Write the mathematical expression for channel capacity.
C = B log2(1 + S/N)
where B is the bandwidth, S the average transmitted power, and N the average noise power.
Shannon's theorem: a given communication system has a maximum rate of information C known as the channel capacity.
o If the information rate R is less than C, then one can approach arbitrarily small error probabilities by using intelligent coding techniques.
o To get lower error probabilities, the encoder has to work on longer blocks of signal data. This entails longer delays and higher computational requirements.
Thus, if R ≤ C, then transmission may be accomplished without error in the presence of noise. Unfortunately, Shannon's theorem is not a constructive proof; it merely states that such a coding method exists. The proof can therefore not be used to develop a coding method that reaches the channel capacity.

The negation of this theorem is also true: if R > C, then errors cannot be avoided regardless of the coding technique used.

40. Calculate the entropy of a source with symbol probabilities 0.6, 0.3 and 0.1. (AUC DEC 2011)
H = −(0.6 log2 0.6 + 0.3 log2 0.3 + 0.1 log2 0.1) ≈ 1.29 bits/symbol

41. A source generates three messages with probability 0.5, 0.25, Calculate H(x). (AUC MAY 2012)
H(x) = 1.2 bits/msg

42. State the advantages of Lempel-Ziv coding. (AUC MAY 2012)
The receiver does not require prior knowledge of the coding table constructed by the transmitter.
It eliminates the need for a large buffer to store the received code words until the decoding dictionary is complete enough to decode them.
It allows synchronous transmission.

43. Define mutual information.
Formally, the mutual information of two discrete random variables X and Y can be defined as
I(X; Y) = Σ_y Σ_x p(x, y) log2 [ p(x, y) / (p1(x) p2(y)) ]
where p(x, y) is the joint probability distribution function of X and Y, and p1(x) and p2(y) are the marginal probability distribution functions of X and Y respectively. In the case of continuous random variables, the summation is replaced by a definite double integral:
I(X; Y) = ∫∫ p(x, y) log2 [ p(x, y) / (p1(x) p2(y)) ] dx dy
where p(x, y) is now the joint probability density function of X and Y, and p1(x) and p2(y) are the marginal probability density functions of X and Y respectively.

44. Define rate distortion theory.
Rate distortion theory is a major branch of information theory which provides the theoretical foundations for lossy data compression; it addresses the problem of determining the minimal amount of entropy (or information) R that should be communicated over a channel so that the source (input signal) can be approximately reconstructed at the receiver (output signal) without exceeding a

given distortion D. The functions that relate the rate and distortion are found as the solution of the following minimization problem:
R(D) = min I(X; Y), the minimum taken over conditional distributions p(y/x) for which the expected distortion does not exceed D.
In the above equation, I(X; Y) is the mutual information.

46. An event has six possible outcomes with probabilities {1/2, 1/4, 1/8, 1/16, 1/32, 1/32}. Find the entropy of the source S. (AUC NOV/DEC 2010)

47. What is entropy? (AUC APR/MAY 2011)
The entropy of a source is the average information produced per individual message or symbol in a particular interval.

48. What is information rate? (AUC NOV/DEC 2011)
Information rate R is the average number of bits of information per second:
R = rH information bits/second
where H is the entropy and r is the rate at which messages are generated.

PART-B

1. (i) Define mutual information. Find the relation between the mutual information and the joint entropy of the channel input and channel output.

2. Consider a discrete memoryless channel with input alphabet X, output alphabet Y and transition probabilities p(y_k / x_j). Find the mutual information of the channel to obtain the channel capacity.
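The relation asked for in Part-B question 1, I(X; Y) = H(X) + H(Y) − H(X, Y), can be illustrated numerically. The joint distribution below is an assumed example (it corresponds to a BSC with crossover probability 0.2 driven by equiprobable inputs), chosen only for illustration.

```python
import math

# Assumed joint distribution p(x, y) for binary X and Y (illustrative only)
joint = {(0, 0): 0.4, (0, 1): 0.1,
         (1, 0): 0.1, (1, 1): 0.4}

def H(probs):
    """Entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Marginal distributions of X and Y
px = [sum(p for (x, _), p in joint.items() if x == xv) for xv in (0, 1)]
py = [sum(p for (_, y), p in joint.items() if y == yv) for yv in (0, 1)]

# I(X;Y) = H(X) + H(Y) - H(X,Y), one of the properties listed in Part A
I = H(px) + H(py) - H(list(joint.values()))
print(round(I, 4))   # ~0.2781 bits
```

The result matches 1 − H(0.2), the capacity of a BSC with p = 0.2, since equiprobable inputs achieve that channel's capacity.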

3. (i) State and prove the properties of mutual information. Define mutual information. State any two properties. (4)

(ii) What are the implications of the information capacity theorem? (AUC NOV 2006)

(ii) Give the advantages and disadvantages of channel coding in detail.

4. Derive the channel capacity theorem.
Shannon's theorem gives an upper bound to the capacity of a link, in bits per second (bps), as a function of the available bandwidth and the signal-to-noise ratio of the link. The theorem can be stated as
C = B log2(1 + S/N)
where C is the achievable channel capacity, B is the bandwidth of the line, S is the average signal power and N is the average noise power. The signal-to-noise ratio (S/N) is usually expressed in decibels (dB), given by the formula 10 log10(S/N); so, for example, a signal-to-noise ratio of 1000 is commonly expressed as 10 log10(1000) = 30 dB. (A graph shows the relationship between C/B and S/N in dB.)

(iii) Discuss the implications of the information capacity theorem. (AUC MAY 2007)

5. (i) Derive the expression for the channel capacity of a binary symmetric channel. Discuss in detail the binary symmetric channel and the binary erasure channel. Derive the channel capacity for the binary symmetric channel. (6)
(ii) Derive the channel capacity for a band-limited, power-limited Gaussian channel. (10) (AUC DEC 2010) Also find the channel capacity of the binary symmetric channel. (AUC NOV 2008)

6. Give the (Shannon-Hartley) information capacity theorem. (AUC MAY 2009)
It can be stated as follows: the information capacity of a continuous channel of bandwidth B Hz, perturbed by additive white Gaussian noise of power spectral density N0/2 and limited in bandwidth to B, is given by
C = B log2(1 + P / (N0 B)) bits per second
where P is the average transmitted power. This theorem implies that, for given average transmitted power P and channel bandwidth B, we can transmit information at the rate C bits per second with arbitrarily small probability of error by employing sufficiently complex encoding systems.

7. Find the code words for five symbols of the alphabet of a discrete memoryless source with probabilities {0.4, 0.2, 0.2, 0.1, 0.1} using Huffman coding, and determine the source entropy and average code-word length. (10) (AUC NOV 2006)
Consider a sequence of letters of the English alphabet with their probabilities of occurrence as given.
Letter: A I L M N O P Y
Probability:
Compute two different Huffman codes for this alphabet. Also, for each of the two codes, find the average code-word length and the variance of the code-word length over the ensemble of letters. (AUC NOV 2008)
A discrete memoryless source has the following alphabet with probability of occurrence.

Symbol: S0 S1 S2 S3 S4 S5 S6
Probability:
Generate the Huffman coding. Find the average coded length, entropy and η. (AUC NOV 2007)
Encode the following source using Huffman coding: X = {x1, x2, x3, x4, x5}, P(X) = {0.2, 0.02, 0.1, 0.38, 0.3}. (AUC MAY 2006)

10. (i) Derive the channel capacity for the binary symmetric channel. (6)
(ii) Derive the channel capacity for the band-limited, power-limited Gaussian channel. (10) (AUC DEC 2010) (AUC MAY 2011)
(iii) The channel transition matrix is [ ]. Draw the channel diagram and determine the probabilities associated with the output, assuming equiprobable inputs. (AUC MAY 2011)

11. Justify the need for an efficient source encoding process in order to increase the average transmitted information per bit, if the source-emitted symbols are not equally likely. Consider a discrete memoryless source for your justification, with an example. (AUC DEC 2011)
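The Huffman exercises above can be checked with a short Python sketch of the standard heap-based algorithm; the symbol names s0–s4 are illustrative, and the probabilities are those of Part-B question 7.

```python
import heapq
import math
from itertools import count

def huffman(probs):
    """Build a Huffman code; returns {symbol: codeword}."""
    tie = count()  # tie-breaker so the heap never compares dicts
    heap = [(p, next(tie), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # two least probable groups
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tie), merged))
    return heap[0][2]

# Part-B question 7: probabilities {0.4, 0.2, 0.2, 0.1, 0.1}
probs = {"s0": 0.4, "s1": 0.2, "s2": 0.2, "s3": 0.1, "s4": 0.1}
code = huffman(probs)
L = sum(probs[s] * len(w) for s, w in code.items())   # average length
H = -sum(p * math.log2(p) for p in probs.values())    # source entropy
print(code)
print(L, H, H / L)   # average length 2.2, entropy ~2.122, efficiency ~96.5%
```

Any valid Huffman code for this source attains the same average length of 2.2 bits/symbol, even though the individual code words may differ with the tie-breaking order.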

12. A discrete memoryless source (DMS) has the following alphabet with probability of occurrence as shown below.
Symbol: S0 S1 S2 S3 S4 S5 S6
Probability:
(AUC MAY 2012)

13. Derive the Shannon-Hartley theorem for the channel capacity of a continuous channel having an average power limitation and perturbed by an additive band-limited white Gaussian noise.

1. Brief the properties of entropy.
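The entropy properties asked for above (H ≥ 0, H ≤ log2 K, with equality at the upper bound only for equiprobable symbols) can be verified numerically; a small sketch with an assumed alphabet size K = 8:

```python
import math
import random

def entropy(probs):
    """H = -sum p log2 p, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

K = 8

# Property 1: a deterministic source (one symbol certain) has zero entropy
assert entropy([1.0] + [0.0] * (K - 1)) == 0.0

# Property 2: the equiprobable source attains the maximum H = log2 K
assert abs(entropy([1 / K] * K) - math.log2(K)) < 1e-12

# Property 3: any other distribution satisfies 0 <= H <= log2 K
random.seed(1)
raw = [random.random() for _ in range(K)]
probs = [x / sum(raw) for x in raw]
h = entropy(probs)
print(h)
assert 0.0 <= h <= math.log2(K)
```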


14. Five symbols of the alphabet of a discrete memoryless source and their probabilities are given below: S = {s0, s1, s2, s3, s4}, P(S) = (0.4, 0.2, 0.2, 0.1, 0.1). Code the symbols using Huffman coding. (12) (AUC NOV/DEC 2010)
Using Huffman coding, encode the following symbols: S = [0.3, 0.2, 0.25, 0.12, 0.05, 0.08]. Calculate the average code length, entropy of the source, code efficiency and redundancy.

Efficiency = 96%.

15. Write in detail the procedure of the Shannon-Fano coding scheme. Eight possible messages m1, m2, m3, m4, m5, m6, m7 and m8 from a source have probabilities P(m1)=0.5, P(m2)=0.15, P(m3)=0.15, P(m4)=0.08, P(m5)=0.08, P(m6)=0.02, P(m7)=0.01, P(m8)=0.01. Construct the Shannon-Fano coding for each of these messages in order to increase the average information per bit. Find the coding efficiency. (AUC DEC 2011)
In Shannon-Fano coding, the symbols are arranged in order from most probable to least probable, and then divided into two sets whose total probabilities are as close as possible to being equal. All symbols then have the first digits of their codes assigned: symbols in the first set receive "0" and symbols in the second set receive "1". As long as any sets with more than one member remain, the same process is repeated on those sets to determine successive digits of their codes. When a set has been reduced to one symbol, the symbol's code is complete and will not form the prefix of any other symbol's code.
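A recursive Python sketch of the procedure just described. The split point is chosen greedily so the two halves are as close to equiprobable as possible; the symbol names A0–A4 and their probabilities {0.4, 0.3, 0.15, 0.1, 0.05} are an illustrative example.

```python
import math

def shannon_fano(symbols):
    """symbols: list of (name, prob) sorted by decreasing prob.
    Returns {name: codeword} built by recursive near-equal splits."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    # choose the split point that makes the two halves closest to equiprobable
    best, acc, best_diff = 1, 0.0, float("inf")
    for i, (_, p) in enumerate(symbols[:-1], start=1):
        acc += p
        diff = abs(acc - (total - acc))
        if diff < best_diff:
            best, best_diff = i, diff
    code = {s: "0" + w for s, w in shannon_fano(symbols[:best]).items()}
    code.update({s: "1" + w for s, w in shannon_fano(symbols[best:]).items()})
    return code

src = [("A0", 0.4), ("A1", 0.3), ("A2", 0.15), ("A3", 0.1), ("A4", 0.05)]
code = shannon_fano(src)
L = sum(p * len(code[s]) for s, p in src)             # average length
H = -sum(p * math.log2(p) for _, p in src)            # source entropy
print(code)          # A0 -> '0', A1 -> '10', A2 -> '110', ...
print(L, H, H / L)   # average length 2.05, entropy ~2.01, efficiency ~98%
```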

The algorithm works, and it produces fairly efficient variable-length encodings; when the two smaller sets produced by a partitioning are in fact of equal probability, the one bit of information used to distinguish them is used most efficiently. Unfortunately, Shannon-Fano does not always produce optimal prefix codes. For this reason, Shannon-Fano is almost never used; Huffman coding is almost as computationally simple and produces prefix codes that always achieve the lowest expected code-word length. Shannon-Fano coding is used in the IMPLODE compression method, which is part of the ZIP file format, where a simple algorithm with high performance and minimum programming requirements is desired.
Shannon-Fano algorithm: a Shannon-Fano tree is built according to a specification designed to define an effective code table. The actual algorithm is simple:
1. For a given list of symbols, develop a corresponding list of probabilities or frequency counts so that each symbol's relative frequency of occurrence is known.
2. Sort the lists of symbols according to frequency, with the most frequently occurring symbols at the left and the least common at the right.
3. Divide the list into two parts, with the total frequency counts of the left part being as close to the total of the right as possible.
4. The left part of the list is assigned the binary digit 0, and the right part is assigned the digit 1. This means that the codes for the symbols in the first part will all start with 0, and the codes in the second part will all start with 1.
5. Recursively apply steps 3 and 4 to each of the two halves, subdividing groups and adding bits to the codes until each symbol has become a corresponding code leaf on the tree.
Example 1: The source of information A generates the symbols {A0, A1, A2, A3, A4} with the corresponding probabilities {0.4, 0.3, 0.15, 0.1 and 0.05}. Encoding the source symbols using a binary encoder and a Shannon-Fano encoder gives:

16. Explain the concept of the noiseless coding theorem and state its significance. Give the (Shannon-Hartley) information capacity theorem and discuss its implications in detail. State and explain Shannon's theorem on channel capacity. (12)
(ii) Discuss the source coding theorem. (6) (AUC MAY 2010)
(i) Discuss the source coding theorem. Elaborate the channel coding theorem with an example.
Source Coding Theorem (Shannon's first theorem): Given a discrete memoryless source of entropy H(S), the average code-word length L for any distortionless source coding is bounded as L ≥ H(S). This theorem provides the mathematical

tool for assessing data compaction, i.e. lossless data compression, of data generated by a discrete memoryless source. The entropy of a source is a function of the probabilities of the source symbols that constitute the alphabet of the source.
Entropy of a Discrete Memoryless Source: assume that the source output is modeled as a discrete random variable, S, which takes on symbols from a fixed finite alphabet with probabilities P(S = s_k) = p_k, k = 0, 1, ..., K − 1. The amount of information gained after observing the event S = s_k is the logarithmic function I(s_k) = log2(1/p_k), and the entropy is
H(S) = Σ_k p_k log2(1/p_k)
The entropy is a measure of the average information content per source symbol. The source coding theorem is also known as the "noiseless coding theorem" in the sense that it establishes the condition for error-free encoding to be possible.
Channel Coding Theorem (Shannon's 2nd theorem): the channel coding theorem for a discrete memoryless channel is stated in two parts as follows:
(a) Let a discrete memoryless source with an alphabet S have entropy H(S) and produce symbols once every T_s seconds. Let a discrete memoryless channel have capacity C and be used once every T_c seconds. Then if
H(S) / T_s ≤ C / T_c

There exists a coding scheme for which the source output can be transmitted over the channel and be reconstructed with an arbitrarily small probability of error.
Information Capacity Theorem (also known as the Shannon-Hartley law or Shannon's 3rd theorem): the information capacity of a continuous channel of bandwidth B Hz, perturbed by additive white Gaussian noise of power spectral density N0/2 and limited in bandwidth to B, is given by
C = B log2(1 + P / (N0 B)) bits per second
where P is the average transmitted power. This theorem implies that, for given average transmitted power P and channel bandwidth B, we can transmit information at the rate C bits per second with arbitrarily small probability of error by employing sufficiently complex encoding systems.
Explain in detail data compaction codings. Discuss data compaction. (AUC MAY 2007)
(i) An analog signal is band-limited to B Hz and sampled at the Nyquist rate. The sampled signals are quantized into 4 levels; each level represents one message. The probabilities of occurrence of the four messages are p1 = p3 = 1/8 and p2 = p4 = 3/8. Find the information rate of the source. (6)
(ii) Five source messages occur with probabilities m1 = 0.4, m2 = 0.15, m3 = 0.15, m4 = 0.15, m5 = 0.15. Find the coding efficiency for (1) Shannon-Fano coding, (2) Huffman coding. (10) (AUC DEC 2010)
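The four-level quantizer problem in part (i) can be checked numerically. The question leaves B symbolic, so the bandwidth of 4000 Hz below is an assumed value for illustration only; the entropy per message is independent of it.

```python
import math

# Probabilities of the four quantizer levels (from the question)
p = [1/8, 3/8, 3/8, 1/8]
H = -sum(pi * math.log2(pi) for pi in p)   # entropy per message, bits

B = 4000                                   # assumed bandwidth in Hz
r = 2 * B                                  # Nyquist-rate sampling: r = 2B messages/s
R = r * H                                  # information rate R = rH bits/s

print(round(H, 4))    # ~1.8113 bits/message
print(round(R))       # ~14490 bits/s for B = 4000 Hz
```

In symbolic form the answer is R = 2B · 1.8113 ≈ 3.62B bits/s.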


Explain in detail the discrete memoryless channel.

Explain in detail channel capacity.

18. Write short notes on differential entropy.


19. Explain channel capacity.


20. Explain the bandwidth versus signal-to-noise ratio trade-off for this theorem. (AUC MAY 2012)
(ii) Discuss rate distortion theory. (6) (AUC MAY 2010)

21. Explain data compression. Discuss the various techniques used for compression of information. (AUC MAY 2009)
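A numeric sketch of the trade-off in question 20: with fixed transmitted power P and noise density N0, the capacity C = B log2(1 + P/(N0·B)) grows with bandwidth B but saturates at (P/N0)·log2(e). The values of P and N0 below are assumed, chosen only to make the numbers readable.

```python
import math

P = 1.0e-3     # average signal power, W (assumed)
N0 = 1.0e-9    # noise power spectral density, W/Hz (assumed)

def capacity(B):
    """Shannon-Hartley capacity for bandwidth B Hz."""
    return B * math.log2(1 + P / (N0 * B))

# Widening the bandwidth raises capacity, but with diminishing returns
for B in (1e3, 1e4, 1e5, 1e6, 1e7):
    print(f"B = {B:>10.0f} Hz  ->  C = {capacity(B):12.0f} bit/s")

# Infinite-bandwidth limit: C_inf = (P / N0) * log2(e) ~ 1.44 P / N0
c_inf = (P / N0) * math.log2(math.e)
print(c_inf)
```

The printed capacities increase monotonically with B yet never exceed c_inf, which is the quantitative content of the bandwidth-SNR trade-off.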


22. Explain rate distortion theory.

23. (i) Derive the channel capacity of a continuous band-limited white Gaussian noise channel. (10)
(ii) Derive the expression for the channel capacity of a continuous channel. Also find the expression for the channel capacity of a continuous channel of infinite bandwidth and comment on the results. (AUC MAY 2006)
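A worked step for part (ii), the infinite-bandwidth limit of the Shannon-Hartley capacity. Substituting x = P/(N0 B) and using the standard limit lim_{x→0} ln(1+x)/x = 1:

```latex
C_\infty
  = \lim_{B \to \infty} B \log_2\!\Bigl(1 + \frac{P}{N_0 B}\Bigr)
  = \frac{P}{N_0} \lim_{x \to 0} \frac{\log_2(1 + x)}{x}
  \quad \text{where } x = \frac{P}{N_0 B}
  = \frac{P}{N_0}\,\log_2 e \;\approx\; 1.44\,\frac{P}{N_0}
```

The comment the question asks for: capacity does not grow without bound as bandwidth increases; it saturates at a finite value set by the signal-to-noise-density ratio P/N0.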



Introduction to Information Theory. By Prof. S.J. Soni Asst. Professor, CE Department, SPCE, Visnagar Introduction to Information Theory By Prof. S.J. Soni Asst. Professor, CE Department, SPCE, Visnagar Introduction [B.P. Lathi] Almost in all the means of communication, none produces error-free communication.

More information

3F1 Information Theory, Lecture 3

3F1 Information Theory, Lecture 3 3F1 Information Theory, Lecture 3 Jossy Sayir Department of Engineering Michaelmas 2011, 28 November 2011 Memoryless Sources Arithmetic Coding Sources with Memory 2 / 19 Summary of last lecture Prefix-free

More information

Compression and Coding

Compression and Coding Compression and Coding Theory and Applications Part 1: Fundamentals Gloria Menegaz 1 Transmitter (Encoder) What is the problem? Receiver (Decoder) Transformation information unit Channel Ordering (significance)

More information

Coding for Discrete Source

Coding for Discrete Source EGR 544 Communication Theory 3. Coding for Discrete Sources Z. Aliyazicioglu Electrical and Computer Engineering Department Cal Poly Pomona Coding for Discrete Source Coding Represent source data effectively

More information

Noisy channel communication

Noisy channel communication Information Theory http://www.inf.ed.ac.uk/teaching/courses/it/ Week 6 Communication channels and Information Some notes on the noisy channel setup: Iain Murray, 2012 School of Informatics, University

More information

Digital communication system. Shannon s separation principle

Digital communication system. Shannon s separation principle Digital communication system Representation of the source signal by a stream of (binary) symbols Adaptation to the properties of the transmission channel information source source coder channel coder modulation

More information

3F1 Information Theory, Lecture 3

3F1 Information Theory, Lecture 3 3F1 Information Theory, Lecture 3 Jossy Sayir Department of Engineering Michaelmas 2013, 29 November 2013 Memoryless Sources Arithmetic Coding Sources with Memory Markov Example 2 / 21 Encoding the output

More information

Introduction to Information Theory. Uncertainty. Entropy. Surprisal. Joint entropy. Conditional entropy. Mutual information.

Introduction to Information Theory. Uncertainty. Entropy. Surprisal. Joint entropy. Conditional entropy. Mutual information. L65 Dept. of Linguistics, Indiana University Fall 205 Information theory answers two fundamental questions in communication theory: What is the ultimate data compression? What is the transmission rate

More information

Dept. of Linguistics, Indiana University Fall 2015

Dept. of Linguistics, Indiana University Fall 2015 L645 Dept. of Linguistics, Indiana University Fall 2015 1 / 28 Information theory answers two fundamental questions in communication theory: What is the ultimate data compression? What is the transmission

More information

Multimedia Communications. Mathematical Preliminaries for Lossless Compression

Multimedia Communications. Mathematical Preliminaries for Lossless Compression Multimedia Communications Mathematical Preliminaries for Lossless Compression What we will see in this chapter Definition of information and entropy Modeling a data source Definition of coding and when

More information

Block 2: Introduction to Information Theory

Block 2: Introduction to Information Theory Block 2: Introduction to Information Theory Francisco J. Escribano April 26, 2015 Francisco J. Escribano Block 2: Introduction to Information Theory April 26, 2015 1 / 51 Table of contents 1 Motivation

More information

3F1: Signals and Systems INFORMATION THEORY Examples Paper Solutions

3F1: Signals and Systems INFORMATION THEORY Examples Paper Solutions Engineering Tripos Part IIA THIRD YEAR 3F: Signals and Systems INFORMATION THEORY Examples Paper Solutions. Let the joint probability mass function of two binary random variables X and Y be given in the

More information

Digital Communications III (ECE 154C) Introduction to Coding and Information Theory

Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Tara Javidi These lecture notes were originally developed by late Prof. J. K. Wolf. UC San Diego Spring 2014 1 / 8 I

More information

Information and Entropy

Information and Entropy Information and Entropy Shannon s Separation Principle Source Coding Principles Entropy Variable Length Codes Huffman Codes Joint Sources Arithmetic Codes Adaptive Codes Thomas Wiegand: Digital Image Communication

More information

Revision of Lecture 4

Revision of Lecture 4 Revision of Lecture 4 We have completed studying digital sources from information theory viewpoint We have learnt all fundamental principles for source coding, provided by information theory Practical

More information

Module 1. Introduction to Digital Communications and Information Theory. Version 2 ECE IIT, Kharagpur

Module 1. Introduction to Digital Communications and Information Theory. Version 2 ECE IIT, Kharagpur Module ntroduction to Digital Communications and nformation Theory Lesson 3 nformation Theoretic Approach to Digital Communications After reading this lesson, you will learn about Scope of nformation Theory

More information

Chapter I: Fundamental Information Theory

Chapter I: Fundamental Information Theory ECE-S622/T62 Notes Chapter I: Fundamental Information Theory Ruifeng Zhang Dept. of Electrical & Computer Eng. Drexel University. Information Source Information is the outcome of some physical processes.

More information

Bandwidth: Communicate large complex & highly detailed 3D models through lowbandwidth connection (e.g. VRML over the Internet)

Bandwidth: Communicate large complex & highly detailed 3D models through lowbandwidth connection (e.g. VRML over the Internet) Compression Motivation Bandwidth: Communicate large complex & highly detailed 3D models through lowbandwidth connection (e.g. VRML over the Internet) Storage: Store large & complex 3D models (e.g. 3D scanner

More information

3F1 Information Theory, Lecture 1

3F1 Information Theory, Lecture 1 3F1 Information Theory, Lecture 1 Jossy Sayir Department of Engineering Michaelmas 2013, 22 November 2013 Organisation History Entropy Mutual Information 2 / 18 Course Organisation 4 lectures Course material:

More information

Entropies & Information Theory

Entropies & Information Theory Entropies & Information Theory LECTURE I Nilanjana Datta University of Cambridge,U.K. See lecture notes on: http://www.qi.damtp.cam.ac.uk/node/223 Quantum Information Theory Born out of Classical Information

More information

Lecture 22: Final Review

Lecture 22: Final Review Lecture 22: Final Review Nuts and bolts Fundamental questions and limits Tools Practical algorithms Future topics Dr Yao Xie, ECE587, Information Theory, Duke University Basics Dr Yao Xie, ECE587, Information

More information

Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University

Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University Chapter 4 Data Transmission and Channel Capacity Po-Ning Chen, Professor Department of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 30050, R.O.C. Principle of Data Transmission

More information

(Classical) Information Theory III: Noisy channel coding

(Classical) Information Theory III: Noisy channel coding (Classical) Information Theory III: Noisy channel coding Sibasish Ghosh The Institute of Mathematical Sciences CIT Campus, Taramani, Chennai 600 113, India. p. 1 Abstract What is the best possible way

More information

Basic information theory

Basic information theory Basic information theory Communication system performance is limited by Available signal power Background noise Bandwidth limits. Can we postulate an ideal system based on physical principles, against

More information

ECE Information theory Final

ECE Information theory Final ECE 776 - Information theory Final Q1 (1 point) We would like to compress a Gaussian source with zero mean and variance 1 We consider two strategies In the first, we quantize with a step size so that the

More information

Lecture 4 Noisy Channel Coding

Lecture 4 Noisy Channel Coding Lecture 4 Noisy Channel Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 9, 2015 1 / 56 I-Hsiang Wang IT Lecture 4 The Channel Coding Problem

More information

Basic Principles of Video Coding

Basic Principles of Video Coding Basic Principles of Video Coding Introduction Categories of Video Coding Schemes Information Theory Overview of Video Coding Techniques Predictive coding Transform coding Quantization Entropy coding Motion

More information

Run-length & Entropy Coding. Redundancy Removal. Sampling. Quantization. Perform inverse operations at the receiver EEE

Run-length & Entropy Coding. Redundancy Removal. Sampling. Quantization. Perform inverse operations at the receiver EEE General e Image Coder Structure Motion Video x(s 1,s 2,t) or x(s 1,s 2 ) Natural Image Sampling A form of data compression; usually lossless, but can be lossy Redundancy Removal Lossless compression: predictive

More information

Lecture 6 I. CHANNEL CODING. X n (m) P Y X

Lecture 6 I. CHANNEL CODING. X n (m) P Y X 6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder

More information

CSCI 2570 Introduction to Nanocomputing

CSCI 2570 Introduction to Nanocomputing CSCI 2570 Introduction to Nanocomputing Information Theory John E Savage What is Information Theory Introduced by Claude Shannon. See Wikipedia Two foci: a) data compression and b) reliable communication

More information

Lecture 11: Information theory THURSDAY, FEBRUARY 21, 2019

Lecture 11: Information theory THURSDAY, FEBRUARY 21, 2019 Lecture 11: Information theory DANIEL WELLER THURSDAY, FEBRUARY 21, 2019 Agenda Information and probability Entropy and coding Mutual information and capacity Both images contain the same fraction of black

More information

Lecture 2: August 31

Lecture 2: August 31 0-704: Information Processing and Learning Fall 206 Lecturer: Aarti Singh Lecture 2: August 3 Note: These notes are based on scribed notes from Spring5 offering of this course. LaTeX template courtesy

More information

Massachusetts Institute of Technology

Massachusetts Institute of Technology Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science Department of Mechanical Engineering 6.050J/2.0J Information and Entropy Spring 2005 Issued: March 7, 2005

More information

1 Introduction to information theory

1 Introduction to information theory 1 Introduction to information theory 1.1 Introduction In this chapter we present some of the basic concepts of information theory. The situations we have in mind involve the exchange of information through

More information

An instantaneous code (prefix code, tree code) with the codeword lengths l 1,..., l N exists if and only if. 2 l i. i=1

An instantaneous code (prefix code, tree code) with the codeword lengths l 1,..., l N exists if and only if. 2 l i. i=1 Kraft s inequality An instantaneous code (prefix code, tree code) with the codeword lengths l 1,..., l N exists if and only if N 2 l i 1 Proof: Suppose that we have a tree code. Let l max = max{l 1,...,

More information

ELEMENT OF INFORMATION THEORY

ELEMENT OF INFORMATION THEORY History Table of Content ELEMENT OF INFORMATION THEORY O. Le Meur olemeur@irisa.fr Univ. of Rennes 1 http://www.irisa.fr/temics/staff/lemeur/ October 2010 1 History Table of Content VERSION: 2009-2010:

More information

Bioinformatics: Biology X

Bioinformatics: Biology X Bud Mishra Room 1002, 715 Broadway, Courant Institute, NYU, New York, USA Model Building/Checking, Reverse Engineering, Causality Outline 1 Bayesian Interpretation of Probabilities 2 Where (or of what)

More information

ELEC546 Review of Information Theory

ELEC546 Review of Information Theory ELEC546 Review of Information Theory Vincent Lau 1/1/004 1 Review of Information Theory Entropy: Measure of uncertainty of a random variable X. The entropy of X, H(X), is given by: If X is a discrete random

More information

Roll No. :... Invigilator's Signature :.. CS/B.TECH(ECE)/SEM-7/EC-703/ CODING & INFORMATION THEORY. Time Allotted : 3 Hours Full Marks : 70

Roll No. :... Invigilator's Signature :.. CS/B.TECH(ECE)/SEM-7/EC-703/ CODING & INFORMATION THEORY. Time Allotted : 3 Hours Full Marks : 70 Name : Roll No. :.... Invigilator's Signature :.. CS/B.TECH(ECE)/SEM-7/EC-703/2011-12 2011 CODING & INFORMATION THEORY Time Allotted : 3 Hours Full Marks : 70 The figures in the margin indicate full marks

More information

Ch 0 Introduction. 0.1 Overview of Information Theory and Coding

Ch 0 Introduction. 0.1 Overview of Information Theory and Coding Ch 0 Introduction 0.1 Overview of Information Theory and Coding Overview The information theory was founded by Shannon in 1948. This theory is for transmission (communication system) or recording (storage

More information

Multimedia. Multimedia Data Compression (Lossless Compression Algorithms)

Multimedia. Multimedia Data Compression (Lossless Compression Algorithms) Course Code 005636 (Fall 2017) Multimedia Multimedia Data Compression (Lossless Compression Algorithms) Prof. S. M. Riazul Islam, Dept. of Computer Engineering, Sejong University, Korea E-mail: riaz@sejong.ac.kr

More information

Exercise 1. = P(y a 1)P(a 1 )

Exercise 1. = P(y a 1)P(a 1 ) Chapter 7 Channel Capacity Exercise 1 A source produces independent, equally probable symbols from an alphabet {a 1, a 2 } at a rate of one symbol every 3 seconds. These symbols are transmitted over a

More information

18.2 Continuous Alphabet (discrete-time, memoryless) Channel

18.2 Continuous Alphabet (discrete-time, memoryless) Channel 0-704: Information Processing and Learning Spring 0 Lecture 8: Gaussian channel, Parallel channels and Rate-distortion theory Lecturer: Aarti Singh Scribe: Danai Koutra Disclaimer: These notes have not

More information

CS 630 Basic Probability and Information Theory. Tim Campbell

CS 630 Basic Probability and Information Theory. Tim Campbell CS 630 Basic Probability and Information Theory Tim Campbell 21 January 2003 Probability Theory Probability Theory is the study of how best to predict outcomes of events. An experiment (or trial or event)

More information

EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018

EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018 Please submit the solutions on Gradescope. EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018 1. Optimal codeword lengths. Although the codeword lengths of an optimal variable length code

More information

X 1 : X Table 1: Y = X X 2

X 1 : X Table 1: Y = X X 2 ECE 534: Elements of Information Theory, Fall 200 Homework 3 Solutions (ALL DUE to Kenneth S. Palacio Baus) December, 200. Problem 5.20. Multiple access (a) Find the capacity region for the multiple-access

More information

Computational Systems Biology: Biology X

Computational Systems Biology: Biology X Bud Mishra Room 1002, 715 Broadway, Courant Institute, NYU, New York, USA L#8:(November-08-2010) Cancer and Signals Outline 1 Bayesian Interpretation of Probabilities Information Theory Outline Bayesian

More information

Shannon s A Mathematical Theory of Communication

Shannon s A Mathematical Theory of Communication Shannon s A Mathematical Theory of Communication Emre Telatar EPFL Kanpur October 19, 2016 First published in two parts in the July and October 1948 issues of BSTJ. First published in two parts in the

More information

Lecture 4 Channel Coding

Lecture 4 Channel Coding Capacity and the Weak Converse Lecture 4 Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 15, 2014 1 / 16 I-Hsiang Wang NIT Lecture 4 Capacity

More information

ITCT Lecture IV.3: Markov Processes and Sources with Memory

ITCT Lecture IV.3: Markov Processes and Sources with Memory ITCT Lecture IV.3: Markov Processes and Sources with Memory 4. Markov Processes Thus far, we have been occupied with memoryless sources and channels. We must now turn our attention to sources with memory.

More information

Shannon s noisy-channel theorem

Shannon s noisy-channel theorem Shannon s noisy-channel theorem Information theory Amon Elders Korteweg de Vries Institute for Mathematics University of Amsterdam. Tuesday, 26th of Januari Amon Elders (Korteweg de Vries Institute for

More information

1 Ex. 1 Verify that the function H(p 1,..., p n ) = k p k log 2 p k satisfies all 8 axioms on H.

1 Ex. 1 Verify that the function H(p 1,..., p n ) = k p k log 2 p k satisfies all 8 axioms on H. Problem sheet Ex. Verify that the function H(p,..., p n ) = k p k log p k satisfies all 8 axioms on H. Ex. (Not to be handed in). looking at the notes). List as many of the 8 axioms as you can, (without

More information

Lecture 11: Continuous-valued signals and differential entropy

Lecture 11: Continuous-valued signals and differential entropy Lecture 11: Continuous-valued signals and differential entropy Biology 429 Carl Bergstrom September 20, 2008 Sources: Parts of today s lecture follow Chapter 8 from Cover and Thomas (2007). Some components

More information

EE376A - Information Theory Final, Monday March 14th 2016 Solutions. Please start answering each question on a new page of the answer booklet.

EE376A - Information Theory Final, Monday March 14th 2016 Solutions. Please start answering each question on a new page of the answer booklet. EE376A - Information Theory Final, Monday March 14th 216 Solutions Instructions: You have three hours, 3.3PM - 6.3PM The exam has 4 questions, totaling 12 points. Please start answering each question on

More information

ECE Information theory Final (Fall 2008)

ECE Information theory Final (Fall 2008) ECE 776 - Information theory Final (Fall 2008) Q.1. (1 point) Consider the following bursty transmission scheme for a Gaussian channel with noise power N and average power constraint P (i.e., 1/n X n i=1

More information

Information Theory and Coding

Information Theory and Coding Information Theory and oding Subject ode : 0E55 IA Marks : 5 No. of Lecture Hrs/Week : 04 Exam Hours : 0 Total no. of Lecture Hrs. : 5 Exam Marks : 00 PART - A Unit : Information Theory: Introduction,

More information

Information Theory, Statistics, and Decision Trees

Information Theory, Statistics, and Decision Trees Information Theory, Statistics, and Decision Trees Léon Bottou COS 424 4/6/2010 Summary 1. Basic information theory. 2. Decision trees. 3. Information theory and statistics. Léon Bottou 2/31 COS 424 4/6/2010

More information

ECE Advanced Communication Theory, Spring 2009 Homework #1 (INCOMPLETE)

ECE Advanced Communication Theory, Spring 2009 Homework #1 (INCOMPLETE) ECE 74 - Advanced Communication Theory, Spring 2009 Homework #1 (INCOMPLETE) 1. A Huffman code finds the optimal codeword to assign to a given block of source symbols. (a) Show that cannot be a Huffman

More information

2018/5/3. YU Xiangyu

2018/5/3. YU Xiangyu 2018/5/3 YU Xiangyu yuxy@scut.edu.cn Entropy Huffman Code Entropy of Discrete Source Definition of entropy: If an information source X can generate n different messages x 1, x 2,, x i,, x n, then the

More information

An introduction to basic information theory. Hampus Wessman

An introduction to basic information theory. Hampus Wessman An introduction to basic information theory Hampus Wessman Abstract We give a short and simple introduction to basic information theory, by stripping away all the non-essentials. Theoretical bounds on

More information

16.36 Communication Systems Engineering

16.36 Communication Systems Engineering MIT OpenCourseWare http://ocw.mit.edu 16.36 Communication Systems Engineering Spring 2009 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms. 16.36: Communication

More information

Information Theory and Statistics Lecture 2: Source coding

Information Theory and Statistics Lecture 2: Source coding Information Theory and Statistics Lecture 2: Source coding Łukasz Dębowski ldebowsk@ipipan.waw.pl Ph. D. Programme 2013/2014 Injections and codes Definition (injection) Function f is called an injection

More information

Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information

Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information 204 IEEE International Symposium on Information Theory Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information Omur Ozel, Kaya Tutuncuoglu 2, Sennur Ulukus, and Aylin Yener

More information

Shannon s Noisy-Channel Coding Theorem

Shannon s Noisy-Channel Coding Theorem Shannon s Noisy-Channel Coding Theorem Lucas Slot Sebastian Zur February 2015 Abstract In information theory, Shannon s Noisy-Channel Coding Theorem states that it is possible to communicate over a noisy

More information

Midterm Exam Information Theory Fall Midterm Exam. Time: 09:10 12:10 11/23, 2016

Midterm Exam Information Theory Fall Midterm Exam. Time: 09:10 12:10 11/23, 2016 Midterm Exam Time: 09:10 12:10 11/23, 2016 Name: Student ID: Policy: (Read before You Start to Work) The exam is closed book. However, you are allowed to bring TWO A4-size cheat sheet (single-sheet, two-sided).

More information

Chapter 2: Entropy and Mutual Information. University of Illinois at Chicago ECE 534, Natasha Devroye

Chapter 2: Entropy and Mutual Information. University of Illinois at Chicago ECE 534, Natasha Devroye Chapter 2: Entropy and Mutual Information Chapter 2 outline Definitions Entropy Joint entropy, conditional entropy Relative entropy, mutual information Chain rules Jensen s inequality Log-sum inequality

More information

Information in Biology

Information in Biology Lecture 3: Information in Biology Tsvi Tlusty, tsvi@unist.ac.kr Living information is carried by molecular channels Living systems I. Self-replicating information processors Environment II. III. Evolve

More information

Lecture 11: Quantum Information III - Source Coding

Lecture 11: Quantum Information III - Source Coding CSCI5370 Quantum Computing November 25, 203 Lecture : Quantum Information III - Source Coding Lecturer: Shengyu Zhang Scribe: Hing Yin Tsang. Holevo s bound Suppose Alice has an information source X that

More information

Information Theory and Coding Techniques

Information Theory and Coding Techniques Information Theory and Coding Techniques Lecture 1.2: Introduction and Course Outlines Information Theory 1 Information Theory and Coding Techniques Prof. Ja-Ling Wu Department of Computer Science and

More information

Uncertainity, Information, and Entropy

Uncertainity, Information, and Entropy Uncertainity, Information, and Entropy Probabilistic experiment involves the observation of the output emitted by a discrete source during every unit of time. The source output is modeled as a discrete

More information

EE 4TM4: Digital Communications II. Channel Capacity

EE 4TM4: Digital Communications II. Channel Capacity EE 4TM4: Digital Communications II 1 Channel Capacity I. CHANNEL CODING THEOREM Definition 1: A rater is said to be achievable if there exists a sequence of(2 nr,n) codes such thatlim n P (n) e (C) = 0.

More information

Chapter 2: Source coding

Chapter 2: Source coding Chapter 2: meghdadi@ensil.unilim.fr University of Limoges Chapter 2: Entropy of Markov Source Chapter 2: Entropy of Markov Source Markov model for information sources Given the present, the future is independent

More information

Information Theory (Information Theory by J. V. Stone, 2015)

Information Theory (Information Theory by J. V. Stone, 2015) Information Theory (Information Theory by J. V. Stone, 2015) Claude Shannon (1916 2001) Shannon, C. (1948). A mathematical theory of communication. Bell System Technical Journal, 27:379 423. A mathematical

More information

Information in Biology

Information in Biology Information in Biology CRI - Centre de Recherches Interdisciplinaires, Paris May 2012 Information processing is an essential part of Life. Thinking about it in quantitative terms may is useful. 1 Living

More information

Lecture 5 Channel Coding over Continuous Channels

Lecture 5 Channel Coding over Continuous Channels Lecture 5 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 14, 2014 1 / 34 I-Hsiang Wang NIT Lecture 5 From

More information

Lecture 1: Introduction, Entropy and ML estimation

Lecture 1: Introduction, Entropy and ML estimation 0-704: Information Processing and Learning Spring 202 Lecture : Introduction, Entropy and ML estimation Lecturer: Aarti Singh Scribes: Min Xu Disclaimer: These notes have not been subjected to the usual

More information

Exercises with solutions (Set B)

Exercises with solutions (Set B) Exercises with solutions (Set B) 3. A fair coin is tossed an infinite number of times. Let Y n be a random variable, with n Z, that describes the outcome of the n-th coin toss. If the outcome of the n-th

More information

MARKOV CHAINS A finite state Markov chain is a sequence of discrete cv s from a finite alphabet where is a pmf on and for

MARKOV CHAINS A finite state Markov chain is a sequence of discrete cv s from a finite alphabet where is a pmf on and for MARKOV CHAINS A finite state Markov chain is a sequence S 0,S 1,... of discrete cv s from a finite alphabet S where q 0 (s) is a pmf on S 0 and for n 1, Q(s s ) = Pr(S n =s S n 1 =s ) = Pr(S n =s S n 1

More information

Shannon's Theory of Communication

Shannon's Theory of Communication Shannon's Theory of Communication An operational introduction 5 September 2014, Introduction to Information Systems Giovanni Sileno g.sileno@uva.nl Leibniz Center for Law University of Amsterdam Fundamental

More information

Information Theory with Applications, Math6397 Lecture Notes from September 30, 2014 taken by Ilknur Telkes

Information Theory with Applications, Math6397 Lecture Notes from September 30, 2014 taken by Ilknur Telkes Information Theory with Applications, Math6397 Lecture Notes from September 3, 24 taken by Ilknur Telkes Last Time Kraft inequality (sep.or) prefix code Shannon Fano code Bound for average code-word length

More information

Solutions to Homework Set #1 Sanov s Theorem, Rate distortion

Solutions to Homework Set #1 Sanov s Theorem, Rate distortion st Semester 00/ Solutions to Homework Set # Sanov s Theorem, Rate distortion. Sanov s theorem: Prove the simple version of Sanov s theorem for the binary random variables, i.e., let X,X,...,X n be a sequence

More information

Lecture 1. Introduction

Lecture 1. Introduction Lecture 1. Introduction What is the course about? Logistics Questionnaire Dr. Yao Xie, ECE587, Information Theory, Duke University What is information? Dr. Yao Xie, ECE587, Information Theory, Duke University

More information

4F5: Advanced Communications and Coding Handout 2: The Typical Set, Compression, Mutual Information

4F5: Advanced Communications and Coding Handout 2: The Typical Set, Compression, Mutual Information 4F5: Advanced Communications and Coding Handout 2: The Typical Set, Compression, Mutual Information Ramji Venkataramanan Signal Processing and Communications Lab Department of Engineering ramji.v@eng.cam.ac.uk

More information

Notes 3: Stochastic channels and noisy coding theorem bound. 1 Model of information communication and noisy channel

Notes 3: Stochastic channels and noisy coding theorem bound. 1 Model of information communication and noisy channel Introduction to Coding Theory CMU: Spring 2010 Notes 3: Stochastic channels and noisy coding theorem bound January 2010 Lecturer: Venkatesan Guruswami Scribe: Venkatesan Guruswami We now turn to the basic

More information

L. Yaroslavsky. Fundamentals of Digital Image Processing. Course

L. Yaroslavsky. Fundamentals of Digital Image Processing. Course L. Yaroslavsky. Fundamentals of Digital Image Processing. Course 0555.330 Lec. 6. Principles of image coding The term image coding or image compression refers to processing image digital data aimed at

More information