I - Information theory basics


1 Introduction

To communicate, that is, to carry information between two points, we can employ analog or digital transmission techniques. In digital communications the message is constituted by sequences of bits. Digital transmissions offer the following advantages:

1. Robustness to noise and interference that cannot be attained with analog systems (through the use of source and channel coding and a convenient transmission rate in bits/second).

2. Integration of several information sources (analog and digital) in a common format.

3. Security of the information along the path between the source and the destination (through the use of encrypted messages and spread-spectrum techniques).

4. Efficient storage of large amounts of data in optical or magnetic media.

5. Flexibility in the transmission of information through the communication network by formatting data in packets (data + origin and destination addresses + packet number).

Fig. 1 depicts a point-to-point digital communication system. The transmission channel is the physical transmission medium used to connect the source of information (transmitter) to the user (receiver). Different types of channels can be defined, depending on the part of the system we are analyzing. Between the modulator output and the demodulator input we have a continuous channel modeled, for instance, according to Fig. 2. In this case, the channel is completely characterized by the probability density function of the noise $w(t)$. A common channel in telecommunications is the AWGN (additive white Gaussian noise) channel, where $w(t)$ is additive Gaussian noise with power spectral density $G_w(f) = N_0/2$. If, alternatively, we consider in Fig. 1 the channel encoder output and the decoder input, we have a discrete channel that accepts symbols $x_i$ of an input alphabet $X$ provided by the channel encoder and produces symbols $y_j$ belonging to an output alphabet $Y$. When $X$ and $Y$ contain the same symbols, $y_j$ is an estimate of the transmitted symbol $x_i$.

[Figure 1: Functional block diagram of a point-to-point digital communication system: discrete source → source encoder → channel encoder ($x_i$) → modulator ($x(t)$) → transmission channel → demodulator ($y(t)$) → channel decoder ($y_j$) → source decoder → user.]

[Figure 2: Diagram of the AWGN channel: $y(t) = x(t) + w(t)$.]

2 Source encoding

Consider that the message consists of a sequence of symbols selected from a finite set, named the source alphabet. In general, we can associate a given probability with the occurrence of each symbol of the alphabet. Besides, the successive emitted symbols may be statistically independent or exhibit some type of dependency between them. In the former case we say that the source is memoryless. The amount of information carried by a given symbol of the alphabet depends on the uncertainty of its occurrence. For instance, given the sentences "a dog bit a man" and "a man bit a dog", the amount of information is larger in the latter sentence because the probability of occurrence of the latter event is smaller (an event with probability one corresponds to a null amount of information).

Consider a finite alphabet $X$ formed by $M$ symbols $\{x_i\}_{i=1}^{M}$ and define a message as a sequence of independent symbols $x(n)$, $n = 0, 1, \ldots$, with $n$ denoting time. A probability of occurrence $p_i = \mathrm{Prob}(x_i)$ is associated with each symbol. The amount of information corresponding to that symbol is

$$I(x_i) = \log_2 \frac{1}{p_i} \quad \text{(bits)}$$
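As a quick numerical illustration of this formula, here is a minimal Python sketch (the function name is ours, chosen for illustration):

```python
import math

def self_information(p):
    """I(x_i) = log2(1/p_i): information carried by a symbol of probability p, in bits."""
    return math.log2(1.0 / p)

print(self_information(1.0))    # 0.0 bits: a sure event carries no information
print(self_information(0.125))  # 3.0 bits: rarer symbols carry more information
```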

In order to characterize the alphabet we define the average content of information (or entropy) of $X$,

$$H(X) = \sum_{i=1}^{M} p_i I(x_i) = \sum_{i=1}^{M} p_i \log_2 \frac{1}{p_i}$$

which is expressed in bits/symbol.

Example: A source alphabet consists of four symbols with probabilities $p_1 = 1/2$, $p_2 = 1/4$, $p_3 = p_4 = 1/8$. The source entropy is given by

$$H(X) = \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 + \frac{2}{8}\log_2 8 = 1.75 \ \text{bits/symbol}$$

A problem that arises is the encoding of each source symbol through a binary code word (using binary symbols 0 and 1). Since the way of encoding each symbol is not unique, this leads to the question of optimizing the encoding process in the sense of minimizing the average number of bits (binary symbols) used to transmit the message. A classical source encoding example is the Morse code, where the letters A..Z, the numbers 0..9 and some punctuation marks are encoded in binary words constituted by dashes and dots. Let $L$ be the average length of the code words, given by

$$L = \sum_{i=1}^{M} p_i l_i,$$

where $l_i$ is the length (in bits) of the code word associated with the symbol $x_i$. It can be proven that the average length of the code words has a minimum value such that $L \geq H(X)$ if the discrete memoryless source $X$ is to be encoded and uniquely decoded (without ambiguity), that is, in such a way that to each finite sequence of bits there corresponds at most one message. A sufficient condition that allows the code to be uniquely decodable and instantaneous (each word is immediately decoded after its occurrence) is that no code word is the prefix of another longer code word (prefix code).
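A quick numerical check of the bound $L \geq H(X)$ in Python, reusing the four-symbol example above (the code-word lengths 1, 2, 3, 3 correspond to one valid prefix-code choice, ours for illustration):

```python
import math

# The four-symbol example above: probabilities p_i and prefix-code lengths l_i
probs   = [1/2, 1/4, 1/8, 1/8]
lengths = [1, 2, 3, 3]          # e.g., code words 1, 01, 000, 001

H = sum(p * math.log2(1.0 / p) for p in probs)
L = sum(p * l for p, l in zip(probs, lengths))
print(H, L)    # both 1.75: L >= H(X), with equality here because the p_i are powers of 1/2
```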

For instance, the following code

symbols:    x1   x2   x3   x4
code words:  1   01   00   001

is ambiguous, or not uniquely decodable, because the sequence of bits 001 can represent either the symbols x3 + x1 or the symbol x4. But the next code is decodable without ambiguity (prefix code):

symbols:    x1   x2   x3   x4
code words:  1   01  000  001

The Huffman procedure can be used to build uniquely decodable codes (a code sketch of the procedure follows the example below):

Step 1. Order the M symbols according to decreasing values of their probabilities.

Step 2. Group the last two symbols, $x_{M-1}$ and $x_M$, into an equivalent symbol with probability $p_{M-1} + p_M$.

Step 3. Repeat steps 1 and 2 until one symbol is left.

Step 4. Using the tree generated by the previous steps, associate the binary symbols 0 and 1 with each pair of branches originating from a given intermediate node. The code word of each message symbol is written (from left to right) as the binary sequence read from the root of the tree to the corresponding leaf (thus, in the reverse order of the merges).

Example: Determine a Huffman code for the following source symbol probabilities:

symbols:        x1    x2    x3    x4    x5
probabilities: 0.4  0.25   0.2   0.1  0.05

Solution:

A possible solution is

symbols:       x1    x2    x3     x4     x5
code words:     1    01   001   0001   0000

corresponding to the Huffman tree of Fig. 3.

[Figure 3: Example of a Huffman tree.]

The efficiency of the resulting code is defined as

$$\eta = \frac{H}{L},$$

where

$$H = 0.4 \log_2\frac{1}{0.4} + 0.25 \log_2\frac{1}{0.25} + 0.2 \log_2\frac{1}{0.2} + 0.1 \log_2\frac{1}{0.1} + 0.05 \log_2\frac{1}{0.05} = 2.04 \ \text{bits/symbol}$$

and

$$L = 0.4 \times 1 + 0.25 \times 2 + 0.2 \times 3 + 0.1 \times 4 + 0.05 \times 4 = 2.1 \ \text{bits/word},$$

yielding $\eta = H/L = 97.2\%$.
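The four steps above can be implemented with a priority queue. The following Python sketch (function and symbol names are ours) reproduces the example: it yields codeword lengths 1, 2, 3, 4, 4 and hence the same $H$, $L$ and $\eta$:

```python
import heapq
import itertools
import math

def huffman_code(probs):
    """Steps 1-4 above: repeatedly merge the two least probable nodes,
    prepending one branch bit to every symbol inside each merged node."""
    code = {sym: "" for sym in probs}
    tie = itertools.count()                      # breaks ties between equal probabilities
    heap = [(p, next(tie), [sym]) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, grp0 = heapq.heappop(heap)        # least probable node -> branch bit 0
        p1, _, grp1 = heapq.heappop(heap)        # second least probable -> branch bit 1
        for sym in grp0:
            code[sym] = "0" + code[sym]
        for sym in grp1:
            code[sym] = "1" + code[sym]
        heapq.heappush(heap, (p0 + p1, next(tie), grp0 + grp1))
    return code

probs = {"x1": 0.4, "x2": 0.25, "x3": 0.2, "x4": 0.1, "x5": 0.05}
code = huffman_code(probs)
H = sum(p * math.log2(1.0 / p) for p in probs.values())
L = sum(p * len(code[s]) for s, p in probs.items())
print(code)           # one valid assignment; codeword lengths are 1, 2, 3, 4, 4
print(H, L, H / L)    # ~2.04 bits/symbol, 2.1 bits/word, efficiency ~97.2%
```

Different tie-breaking choices produce different but equally efficient prefix codes; only the codeword lengths matter for $L$.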

The Huffman algorithm, proposed in 1952, requires a probabilistic source model. This data compression technique was later surpassed by the Lempel-Ziv algorithm (invented in 1978), which is adaptive and does not require knowledge of the source distribution model. The Lempel-Ziv algorithm is nowadays the most popular data compression technique; when applied to English texts it allows a compaction of about 55%, whereas the Huffman algorithm allows about 43% compaction. Note that the purpose of source encoding is to reduce the source code redundancy, not to protect against channel errors. That task is assigned to channel encoding, to be discussed later in this course.

3 Gaussian channel capacity

The optimal digital system is the one that minimizes the bit error probability when certain constraints are imposed on the transmitted energy and channel bandwidth. A fundamental issue is the possibility of transmitting data without bit errors through a noisy channel. This problem was solved by Claude Shannon in 1948, who showed that, for an AWGN channel, it is possible to transmit data with a bit error probability as small as desired (virtually tending to zero) provided that the transmission rate (in bits/second) is smaller than the channel capacity

$$C = B \log_2\left(1 + \frac{P}{N_0 B}\right) \ \text{bits/s}$$

where $B$ is the channel bandwidth in Hz, $P$ is the average power of the received signal in watts, and $P/(N_0 B)$ is the reception signal-to-noise ratio. The channel capacity theorem establishes the theoretical limit that actual communication systems can achieve, although it does not specify which modulation and encoding/decoding techniques should be used to attain that limit.

Example: What is the capacity of the AWGN channel with bandwidth B = 10 kHz when the signal-to-noise ratio is a) 0 dB; b) 20 dB?

Solution: a) $C = 10^4 \log_2 2 = 10$ kbit/s; b) $C = 10^4 \log_2 101 = 66.6$ kbit/s.
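A minimal Python sketch of the capacity formula, reproducing the example above (the function name is ours):

```python
import math

def awgn_capacity(B_hz, snr_db):
    """C = B log2(1 + P/(N0 B)) for the AWGN channel, with the SNR given in dB."""
    snr = 10.0 ** (snr_db / 10.0)      # dB -> linear power ratio P/(N0 B)
    return B_hz * math.log2(1.0 + snr)

print(awgn_capacity(10e3, 0))    # 10000.0 bit/s (SNR = 0 dB)
print(awgn_capacity(10e3, 20))   # ~66582 bit/s  (SNR = 20 dB)
```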

Let $E_b$ be the average bit energy and $R_b$ the transmission rate in bits/second, so that $P = E_b R_b$. The Shannon theorem may be re-written as

$$\frac{C}{B} = \log_2\left(1 + \frac{E_b R_b}{N_0 B}\right).$$

But $R_b \leq C$; thus

$$\frac{R_b}{B} \leq \log_2\left(1 + \frac{E_b R_b}{N_0 B}\right)$$

or

$$\frac{E_b}{N_0} \geq \frac{2^{R_b/B} - 1}{R_b/B}.$$

This inequality gives us the minimum value of the bit signal-to-noise ratio for transmissions with arbitrarily small error probabilities. If now we allow the channel bandwidth to increase to infinity, the asymptotic value of the capacity is

$$C_\infty = \lim_{B\to\infty} C = \frac{1}{\ln 2}\lim_{B\to\infty} B \ln\left(1 + \frac{P}{N_0 B}\right)$$

But $\lim_{n\to\infty}(1 + 1/n)^n = e$, leading to

$$C_\infty = \frac{P/N_0}{\ln 2}$$

where $P = E_b/T_b = r_b E_b$ ($r_b = 1/T_b$ is the transmission rate in bits/s). But $r_b < C_\infty$, so

$$\frac{E_b}{N_0} > \ln 2 \approx -1.6 \ \text{dB}$$

This value is the absolute minimum for communications with virtually null error probabilities, and it is named the Shannon limit.

Example: Determine the minimum bit signal-to-noise ratio required to transmit with an arbitrarily small error probability at the rate of 1 kbit/second when the channel bandwidth is a) B = 1 kHz, b) B = 100 Hz.

Solution: a) $R_b/B = 1$, $E_b/N_0 \geq 1$ (0 dB); b) $R_b/B = 10$, $E_b/N_0 \geq 102.3$ (20.1 dB).
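A short Python check of this bound and of the two examples (the function name is ours):

```python
import math

def min_ebn0_db(rb_over_b):
    """Minimum Eb/N0 in dB, from Eb/N0 >= (2^(Rb/B) - 1) / (Rb/B)."""
    ratio = (2.0 ** rb_over_b - 1.0) / rb_over_b
    return 10.0 * math.log10(ratio)

print(min_ebn0_db(1))                  # 0 dB      (example a)
print(min_ebn0_db(10))                 # ~20.1 dB  (example b)
print(10 * math.log10(math.log(2)))    # ~-1.59 dB: the Shannon limit as Rb/B -> 0
```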

4 Discrete memoryless channel

A discrete channel is characterized by an input alphabet $X = \{x_i\}$, $i = 1, \ldots, M$, an output alphabet $Y = \{y_j\}$, $j = 1, \ldots, N$, and a set of conditional probabilities $p_{ij}$, where $p_{ij} = P(y_j | x_i)$ represents the probability of receiving the symbol $y_j$ when symbol $x_i$ was transmitted (see Fig. 4). It is assumed that the channel does not have memory, that is,

$$P(y(1), \ldots, y(n) \mid x(1), \ldots, x(n)) = \prod_{i=1}^{n} P(y(i) \mid x(i))$$

where $x(i)$ and $y(i)$ are respectively the channel input and output symbols that occur at the discrete time $i$, with $i = 1, \ldots, n$.

[Figure 4: Model of the discrete memoryless channel: transition probabilities $p_{11}, \ldots, p_{MN}$ connecting inputs $x_1, \ldots, x_M$ to outputs $y_1, \ldots, y_N$.]

In general we have

$$\sum_{j=1}^{N} p_{ij} = 1, \quad i = 1, \ldots, M$$

that is, the sum of all the transition probabilities leaving the same input symbol is equal to one. It is usual to organize the transition probabilities in the so-called channel matrix

$$\mathbf{P} = \begin{bmatrix} p_{11} & p_{12} & \cdots & p_{1N} \\ p_{21} & p_{22} & \cdots & p_{2N} \\ \vdots & \vdots & & \vdots \\ p_{M1} & p_{M2} & \cdots & p_{MN} \end{bmatrix}$$
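As a quick sanity check in Python: every row of a channel matrix must sum to one. The matrix below is the 3×3 symmetric-channel example used later in this section:

```python
# Rows hold P(y_j | x_i); every row of a channel matrix must sum to one.
P = [[1/2, 1/3, 1/6],
     [1/6, 1/2, 1/3],
     [1/3, 1/6, 1/2]]

for i, row in enumerate(P):
    assert abs(sum(row) - 1.0) < 1e-12, f"row {i} does not sum to 1"
```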

For $M = N$ we define the average error probability as

$$P(e) = \sum_{i=1}^{N} \sum_{\substack{j=1 \\ j \neq i}}^{N} P(x_i, y_j) = \sum_{i=1}^{N} P(x_i) \sum_{\substack{j=1 \\ j \neq i}}^{N} P(y_j | x_i) = \sum_{i=1}^{N} P(x_i) \sum_{\substack{j=1 \\ j \neq i}}^{N} p_{ij} = \sum_{i=1}^{N} P(x_i)\,(1 - p_{ii})$$

whereas the probability of receiving correctly the transmitted symbol is

$$P(c) = 1 - P(e) = \sum_{i=1}^{N} P(x_i)\, p_{ii}$$

Noiseless channel. We have $M = N$ and the transition probabilities are

$$p_{ij} = \begin{cases} 1, & j = i \\ 0, & j \neq i \end{cases}$$

Thus, $P(e) = 0$.

Useless channel. We have $M = N$ and the output symbols are independent of the input symbols:

$$p_{ij} = P(y_j | x_i) = P(y_j), \quad \forall\, i, j$$

The noiseless channel and the useless channel are the extreme cases of possible channel behavior. The output symbol of the noiseless channel uniquely defines the input symbol. In the useless channel the received symbol does not give any useful information about the transmitted symbol.

Symmetric channel. In this channel each row of $\mathbf{P}$ contains the same set of values $\{r_j\}$, $j = 1, \ldots, N$, and each column contains the same set of values $\{q_i\}$, $i = 1, \ldots, M$. Examples:

$$\mathbf{P} = \begin{bmatrix} 1/2 & 1/3 & 1/6 \\ 1/6 & 1/2 & 1/3 \\ 1/3 & 1/6 & 1/2 \end{bmatrix}, \qquad \mathbf{P} = \begin{bmatrix} 1/3 & 1/3 & 1/6 & 1/6 \\ 1/6 & 1/6 & 1/3 & 1/3 \end{bmatrix}$$

Using the channel input and output alphabets, respectively $X$ and $Y$, and the channel matrix $\mathbf{P}$, we can define the following five entropies.

(i) Input entropy $H(X)$:

$$H(X) = \sum_{i=1}^{M} P(x_i) \log_2\frac{1}{P(x_i)} \quad \text{bits/symbol}$$

which measures the average amount of information of each symbol of $X$.

(ii) Output entropy $H(Y)$:

$$H(Y) = \sum_{j=1}^{N} P(y_j) \log_2\frac{1}{P(y_j)} \quad \text{bits/symbol}$$

which measures the average amount of information of each symbol of $Y$.

(iii) Joint entropy $H(X, Y)$:

$$H(X, Y) = \sum_{i=1}^{M} \sum_{j=1}^{N} P(x_i, y_j) \log_2\frac{1}{P(x_i, y_j)} \quad \text{bits/(pair of symbols)}$$

which measures the average information content of a pair of input and output channel symbols.

(iv) Conditional entropy $H(Y|X)$:

$$H(Y|X) = \sum_{i=1}^{M} \sum_{j=1}^{N} P(x_i, y_j) \log_2\frac{1}{P(y_j | x_i)} \quad \text{bits/symbol}$$

which measures the average amount of information required to specify the output (received) symbol when the input (transmitted) symbol is known.

(v) Conditional entropy $H(X|Y)$:

$$H(X|Y) = \sum_{i=1}^{M} \sum_{j=1}^{N} P(x_i, y_j) \log_2\frac{1}{P(x_i | y_j)} \quad \text{bits/symbol}$$

which measures the average amount of information required to specify the input symbol when the output symbol is known. This conditional entropy represents the average amount of information that is lost in the channel (the equivocation). It can also be conceived as the uncertainty about the channel input after the observation of the channel output. Note that for a noiseless channel there is no loss of information in the channel and we have $H(X|Y) = 0$, whereas in the useless channel we have $H(X|Y) = H(X)$. In the latter case, the uncertainty about the transmitted symbol remains unaltered by the observation (reception) of the output symbol (all the information was lost in the channel).

Using the previous entropy definitions we obtain

$$H(X, Y) = H(Y, X) = H(X) + H(Y|X) = H(Y) + H(X|Y) \qquad (1)$$

Moreover, conditioning cannot increase entropy: $H(X|Y) \leq H(X)$ and $H(Y|X) \leq H(Y)$.
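The five entropies can be computed from the input priors and the channel matrix. A Python sketch with a hypothetical binary channel (the numbers 0.5 and 0.9/0.1 are ours, chosen only for illustration); it also verifies the chain rule (1):

```python
import math

def H(probs):
    """Entropy in bits of a list of probabilities (zero terms contribute nothing)."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

q_x = [0.5, 0.5]                 # hypothetical input priors P(x_i)
P = [[0.9, 0.1],                 # hypothetical channel matrix, rows P(y_j | x_i)
     [0.1, 0.9]]

joint = [q_x[i] * P[i][j] for i in range(2) for j in range(2)]     # P(x_i, y_j)
p_y = [sum(q_x[i] * P[i][j] for i in range(2)) for j in range(2)]  # P(y_j)

HX, HY, HXY = H(q_x), H(p_y), H(joint)
print("H(Y|X) =", HXY - HX)      # chain rule (1): H(X,Y) = H(X) + H(Y|X)
print("H(X|Y) =", HXY - HY)      # equivocation:   H(X,Y) = H(Y) + H(X|Y)
```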

5 Capacity of the discrete memoryless channel

We define the flow of information (or mutual information) between $X$ and $Y$ through the channel as

$$I(X; Y) \equiv H(X) - H(X|Y) \quad \text{bits/symbol} \qquad (2)$$

or, using (1),

$$I(X; Y) = H(Y) - H(Y|X) = H(X) + H(Y) - H(X, Y) \qquad (3)$$

[Figure 5: Relation between the entropies $H(X)$ and $H(Y)$, the conditional entropies $H(X|Y)$ and $H(Y|X)$, and the mutual information $I(X;Y)$.]

We have

$$I(X; Y) = H(X) + H(Y) - H(X, Y) = E\left[\log_2\frac{1}{P(X)}\right] + E\left[\log_2\frac{1}{P(Y)}\right] - E\left[\log_2\frac{1}{P(X, Y)}\right] = E\left[\log_2\frac{P(X, Y)}{P(X)\,P(Y)}\right]$$

But $P(x_i, y_j) = P(y_j | x_i)\,P(x_i)$, leading to

$$I(X; Y) = \sum_{i=1}^{M} \sum_{j=1}^{N} P(x_i, y_j) \log_2\frac{P(x_i, y_j)}{P(x_i)\,P(y_j)} = \sum_{i=1}^{M} \sum_{j=1}^{N} P(x_i, y_j) \log_2\frac{P(y_j | x_i)}{P(y_j)}$$

From (2) and (3) we also get

$$I(X; Y) = I(Y; X)$$

The mutual information $I(X; Y)$ quantifies the reduction of uncertainty about $X$ given the knowledge of $Y$ (see Fig. 5).
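A self-contained Python sketch of the last expression, using the same hypothetical binary channel as above (here $I(X;Y) = 1 - H(0.1) \approx 0.531$ bits):

```python
import math

def mutual_information(q_x, P):
    """I(X;Y) in bits from input priors q_x and channel matrix P (rows P(y|x))."""
    M, N = len(q_x), len(P[0])
    p_y = [sum(q_x[i] * P[i][j] for i in range(M)) for j in range(N)]
    I = 0.0
    for i in range(M):
        for j in range(N):
            p_xy = q_x[i] * P[i][j]        # P(x_i, y_j) = P(y_j | x_i) P(x_i)
            if p_xy > 0:
                I += p_xy * math.log2(P[i][j] / p_y[j])
    return I

print(mutual_information([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]]))   # ~0.531 bits
```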

The capacity $C$ of a discrete memoryless channel is defined as the maximum of the mutual information $I(X; Y)$ that can be transmitted through the channel,

$$C \equiv \max_{P(x)} I(X; Y) \quad \text{bits/transmission}$$

where the maximization is carried out over the probabilities $P(x_i)$ of the input symbols.

6 Capacity of the binary symmetric channel

Consider the binary symmetric channel (BSC) of Fig. 6, with transition probability $p$, and let $q \equiv P(x_1)$ and $r \equiv P(y_1)$.

[Figure 6: Binary symmetric channel (BSC).]

The entropy $H(X)$ of source $X$ is (see Fig. 7)

$$H(q) = q \log_2\frac{1}{q} + (1 - q) \log_2\frac{1}{1 - q}$$

and the entropy of source $Y$ is $H(Y) = H(r)$, with

$$H(Y|X) = \sum_{i=1}^{2} \sum_{j=1}^{2} P(x_i, y_j) \log_2\frac{1}{P(y_j | x_i)} \quad \text{bits/symbol}$$

Besides,

$$P(x_1, y_1) = P(y_1 | x_1)\,P(x_1) = (1 - p)\,q$$
$$P(x_1, y_2) = P(y_2 | x_1)\,P(x_1) = p\,q$$
$$P(x_2, y_1) = P(y_1 | x_2)\,P(x_2) = p\,(1 - q)$$
$$P(x_2, y_2) = P(y_2 | x_2)\,P(x_2) = (1 - p)(1 - q)$$

[Figure 7: Entropy $H(q)$ of the binary source $X$.]

resulting in

$$H(Y|X) = (1 - p)\,q \log_2\frac{1}{1 - p} + p\,q \log_2\frac{1}{p} + p\,(1 - q) \log_2\frac{1}{p} + (1 - p)(1 - q) \log_2\frac{1}{1 - p} = H(p)$$

Thus, the mutual information of the BSC is given by

$$I(X; Y) = H(Y) - H(Y|X) = H(r) - H(p)$$

and the capacity of the BSC is

$$C = \max_{r}\{H(r)\} - H(p) \quad \text{bits/transmission}$$

or, taking into account Fig. 7,

$$C = 1 - H(p)$$

The plot of the BSC capacity versus the transition probability $p$ is shown in Fig. 8.

[Figure 8: Capacity of the BSC (bits/transmission) versus the transition probability $p$.]

The situation that leads to the maximum, that is, $H(r) = 1$, corresponds to

$$r \log_2\frac{1}{r} + (1 - r) \log_2\frac{1}{1 - r} = 1$$

which, by inspection of Fig. 7, gives $r = 1 - r = 1/2$. In other words, the maximum of information transmission from the channel input to the output, for any value of $p$, occurs when the probabilities of $y_1$ and $y_2$ are equal. The channel capacity is maximum when $p = 0$ or $p = 1$, since in both cases the channel is noiseless (see Fig. 9).

[Figure 9: Noiseless channels that maximize the capacity C (the cases $p = 0$ and $p = 1$).]

For $p = 1/2$ the channel capacity is zero because the output symbols are independent of the input symbols and no information can flow through the channel. We have then

$$I(X; Y) = H(Y) - H(Y|X) = H(r) - H(p) = H\left(\tfrac{1}{2}\right) - H\left(\tfrac{1}{2}\right) = 0$$
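A minimal Python sketch of the BSC capacity $C = 1 - H(p)$ (function names are ours):

```python
import math

def binary_entropy(p):
    """H(p) in bits; H(0) = H(1) = 0 by continuity."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return p * math.log2(1.0 / p) + (1.0 - p) * math.log2(1.0 / (1.0 - p))

def bsc_capacity(p):
    """C = 1 - H(p), attained with equiprobable inputs (r = 1/2)."""
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(p, bsc_capacity(p))   # C = 1 at p = 0 or p = 1; C = 0 at p = 1/2
```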

Bibliography

S. Benedetto and E. Biglieri, Principles of Digital Transmission with Wireless Applications, Kluwer, 1999.

Simon Haykin, Communication Systems, 4th edition, Wiley, 2001.

C. E. Shannon, "A mathematical theory of communication", Bell Syst. Tech. J., vol. 27, pp. 379-423, 623-656, July-Oct. 1948.

S. Verdú, "Fifty years of Shannon theory", IEEE Trans. Inform. Theory, vol. 44, pp. 2057-2078, Oct. 1998.
