Information Sources. Professor A. Manikas. Imperial College London. EE303 - Communication Systems: An Overview of Fundamentals


1 Information Sources. Professor A. Manikas, Imperial College London. EE303 - Communication Systems: An Overview of Fundamentals

2 Table of Contents
1 Introduction
2 Classification of Information Sources
3 Discrete Memoryless Sources (DMS)
4 Measure of Information Generated by a DMS Source: Source Entropy; Source Information Rate; Examples
5 A Joint DMS Source
6 Markov Discrete Information Sources
7 Continuous Sources/Signals
8 Measure of Information Generated by a Continuous Source: Differential Entropy; Important Relationships
9 A Note on Information Sinks

3 Introduction
Wireline and fiber communications, as well as wireless communications, are fully digital. The general block structure of a Digital Communication System is shown on the following page. It is common practice for its quality to be expressed in terms of the accuracy with which the binary digits delivered at the output of the detector (point B̂2) represent the binary digits that were fed into the digital modulator (point B2). The fraction of the binary digits that are delivered back in error is generally taken as the measure of the quality of the communication system. This fraction, or rate, is referred to as the bit error probability, or bit-error-rate, BER (point B̂2).
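As an illustration of this figure of merit, here is a minimal Python sketch (the two bit streams are invented purely for illustration, not taken from the notes) that computes the BER as the fraction of detector-output bits that disagree with the modulator-input bits:

```python
# Hedged sketch: BER as the fraction of bits delivered back in error.
# The two bit streams below are made up for illustration only.
import numpy as np

tx_bits = np.array([0, 1, 1, 0, 1, 0, 0, 1])  # bits into the modulator (point B2)
rx_bits = np.array([0, 1, 0, 0, 1, 0, 1, 1])  # bits out of the detector (point B̂2)

ber = np.mean(tx_bits != rx_bits)             # fraction of disagreeing bits
print(f"BER = {ber:.3f}")                     # 2 errors out of 8 bits -> 0.250
```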

4 Introduction: Block Structure of a Digital Comm System (figure)

5 Classification of Information Sources
N.B. terminology: "continuous" = "analogue".
Information sources (or communication sources), or simply sources, can be classified as:
- Discrete
  - Discrete Memoryless Sources (DMS)
  - with Memory (e.g. Markov Sources)
- Continuous
  - non-Gaussian
  - Gaussian
Examples, with reference to the previous page's figure:
- continuous: up to points A, A1, or t
- discrete:
  - up to point A2 (levels of quantiser), or
  - up to points B, B1, or B2 (binary digits or binary codewords).
The "inverse" of an information/communication source is an information/communication sink (discrete or continuous).

6 Discrete Memoryless Sources (DMS)
A source is called a Discrete Source if it produces a sequence $\{X[n]\}$ of symbols one after another, with each symbol drawn from a finite alphabet
$\mathcal{X} \triangleq \{x_1, x_2, \ldots, x_M\}$   (1)
with a rate $r_X$ symbols/sec, and in which each symbol $x_m \in \mathcal{X}$ is produced at the output of the source with some associated probability $\Pr(x_m)$, abbreviated $p_m$, i.e.
$\Pr(x_m) \triangleq p_m$   (2)
If successive outputs from a discrete source are statistically independent, or in other words, if at each instant of time the source chooses for transmission one symbol from the set $\mathcal{X} = \{x_1, x_2, \ldots, x_M\}$ and its choices are independent from one time instant to the next, then the source is called a Discrete Memoryless Source (DMS).
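To make the "memoryless" property concrete, the following minimal sketch draws i.i.d. symbols from an assumed ensemble (the alphabet and probabilities below are illustrative, not from the slides):

```python
# Hedged sketch of a DMS: each output symbol is drawn independently of
# all previous ones, from a fixed ensemble [X, p]. Values are assumed.
import numpy as np

alphabet = ["x1", "x2", "x3"]                 # finite alphabet X (M = 3)
p = [0.5, 0.3, 0.2]                           # Pr(x_m); must sum to 1

rng = np.random.default_rng(seed=0)
symbols = rng.choice(alphabet, size=20, p=p)  # independent draws => memoryless
print(" ".join(symbols))
```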

7 Discrete Memoryless Sources (DMS)
If $\mathbf{p}$ represents the vector whose elements are the probabilities associated with the symbols of a source, i.e.
$\mathbf{p} \triangleq [p_1, p_2, \ldots, p_M]^T$   (3)
then the set
$[\mathcal{X}, \mathbf{p}] \triangleq \{(x_1, p_1), (x_2, p_2), \ldots, (x_M, p_M)\}$   (4)
with
$\sum_{m=1}^{M} p_m = 1$   (5)
is defined as the Source Ensemble.

8 Discrete Memoryless Sources (DMS)
A DMS source can be fully described by its ensemble $[\mathcal{X}, \mathbf{p}]$ and its symbol rate (symbols per second) $r_X$.
The point A2 may be considered as the input of a Digital Communication System, where messages consist of sequences of "symbols" selected from an alphabet, e.g. levels of a quantizer, or telegraph letters, numbers and punctuation.

9 Measure of Information Generated by a DMS Source: Source Entropy
Consider a discrete memoryless information source
$[\mathcal{X}, \mathbf{p}] = \{(x_1, p_1), (x_2, p_2), \ldots, (x_M, p_M)\}$ with $\sum_{m=1}^{M} p_m = 1$   (6)
Then the average information per symbol generated by the source is given by the so-called entropy of the source, i.e.
$H_X \triangleq H_X(\mathbf{p}) = -\sum_{m=1}^{M} p_m \underbrace{\log_2(p_m)}_{\triangleq -I(x_m)}$ bits/symbol   (7)
$= -\mathbf{p}^T \log_2(\mathbf{p})$ bits/symbol   (8)
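A minimal sketch of equations (7)-(8) in Python (the probability vectors passed in at the end are illustrative):

```python
# Hedged sketch: entropy of a DMS from its probability vector p,
# in bits/symbol (0 * log2(0) is taken as 0).
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # drop zero-probability symbols
    return -np.sum(p * np.log2(p))    # equivalently -p^T log2(p)

print(entropy([0.5, 0.5]))                # 1.0 bit/symbol (equiprobable binary)
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits/symbol
```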

10 Measure of Information Generated by a DMS Source: Source Information Rate
$H_X$ is a measure of the a priori uncertainty associated with the source output or, equivalently, a measure of the information obtained when the source output is observed. The notion of entropy is not restricted to the case where the ensemble is finite, i.e. $M$ may be infinite. It can also be shown that $H_X$ is the minimum average number of binary digits (bits) per symbol needed to encode the source output.
Based on entropy, the average information bit-rate at the output of the source is defined as follows:
$r_{\mathrm{inf}} = \underbrace{r_X}_{\text{symbols/sec}} \cdot \underbrace{H_X}_{\text{bits/symbol}}$ bits/sec   (9)

11 Measure of Information Generated by a DMS Source: Source Information Rate
Thus we have two types of bits and, therefore, two types of rates. That is:
- data rate: $r_b$ in data bits/sec, or simply bits/sec
- info rate: $r_{\mathrm{inf}}$ in information bits/sec, or simply bits/sec
NB:
- In general: $r_{\mathrm{inf}} \le r_b$   (10)
- In ideal systems: $r_{\mathrm{inf}} = r_b$   (11)

12 Measure of Information Generated by a DMS Source: Examples
Example 1
If $\mathcal{X} = \{0, 1\}$ with probabilities
$\mathbf{p} = [\Pr(0), \Pr(1)]^T = [1/2, 1/2]^T$   (12)
and $r_X = 10$ symbols/sec, then
data rate: $r_b = r_X = 10$ bits/sec
entropy: $H_X = 1$ bit/symbol $= 1$ info bit per data bit
info rate: $r_{\mathrm{inf}} = r_X \cdot H_X = 10$ bits/sec
i.e. $r_b = r_{\mathrm{inf}} = 10$ bits/sec   (13)

13 Measure of Information Generated by a DMS Source: Examples
Example 2
If $\mathcal{X} = \{0, 1\}$ with unequal probabilities
$\mathbf{p} = [\Pr(0), \Pr(1)]^T$   (14)
and $r_X = 10$ symbols/sec, then
data rate: $r_b = r_X = 10$ bits/sec
entropy: $H_X < 1$ bits/symbol (info bits per data bit)
info rate: $r_{\mathrm{inf}} = r_X \cdot H_X < 10$ bits/sec
i.e. $r_b > r_{\mathrm{inf}}$   (15)
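The slide's numerical probabilities are elided, so the sketch below reproduces the same reasoning with assumed values $\Pr(0) = 0.9$, $\Pr(1) = 0.1$:

```python
# Hedged sketch of Example 2 with assumed probabilities (the slide's own
# numbers are not shown): a biased binary DMS has H_X < 1, so the
# information rate falls below the data rate.
import numpy as np

p = np.array([0.9, 0.1])               # assumed Pr(0), Pr(1)
H_X = -np.sum(p * np.log2(p))          # ~0.469 info bits per data bit
r_X = 10                               # symbols/sec; 1 symbol = 1 binary digit
r_b = r_X                              # data rate: 10 bits/sec
r_inf = r_X * H_X                      # info rate: ~4.69 bits/sec
print(f"H_X = {H_X:.3f} bits/symbol, r_b = {r_b}, r_inf = {r_inf:.2f}")
```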

14 A Joint DMS Source
Consider two information sources with the ensembles $[\mathcal{X}, \mathbf{p}]$ and $[\mathcal{Y}, \mathbf{q}]$, respectively, defined as follows:
$[\mathcal{X}, \mathbf{p}] = \{(x_1, p_1), (x_2, p_2), \ldots, (x_M, p_M)\}$ with $\sum_{m=1}^{M} p_m = 1$   (16)
$[\mathcal{Y}, \mathbf{q}] = \{(y_1, q_1), (y_2, q_2), \ldots, (y_K, q_K)\}$ with $\sum_{k=1}^{K} q_k = 1$   (17)
where
$\mathbf{p} = [\underbrace{\Pr(x_1)}_{p_1}, \underbrace{\Pr(x_2)}_{p_2}, \ldots, \underbrace{\Pr(x_M)}_{p_M}]^T$   (18)
$\mathbf{q} = [\underbrace{\Pr(y_1)}_{q_1}, \underbrace{\Pr(y_2)}_{q_2}, \ldots, \underbrace{\Pr(y_K)}_{q_K}]^T$   (19)

15 A Joint DMS Source
Let us form a joint source whose symbols are taken to be pairs of symbols drawn from the two original sources $\mathcal{X}$ and $\mathcal{Y}$. The new source, with alphabet
$\{(x_1, y_1), (x_1, y_2), \ldots, (x_2, y_1), \ldots, (x_m, y_k), \ldots, (x_M, y_K)\}$
is known as a joint source and its ensemble as a joint ensemble, defined as follows:
$(\mathcal{X} \times \mathcal{Y}, \mathbf{J}) = \{\,((x_m, y_k), \underbrace{\Pr(x_m, y_k)}_{\triangleq J_{km}}),\ \forall m, k : 1 \le m \le M,\ 1 \le k \le K\,\}$   (20)

16 A Joint DMS Source
Note that the joint probabilistic relationship between the symbols of $\mathcal{X}$ and $\mathcal{Y}$ is described by the so-called joint-probability matrix
$\mathbf{J} = \begin{bmatrix} \underbrace{\Pr(x_1,y_1)}_{\triangleq J_{11}} & \underbrace{\Pr(x_1,y_2)}_{\triangleq J_{21}} & \ldots & \underbrace{\Pr(x_1,y_K)}_{\triangleq J_{K1}} \\ \Pr(x_2,y_1) & \Pr(x_2,y_2) & \ldots & \Pr(x_2,y_K) \\ \vdots & \vdots & \ddots & \vdots \\ \Pr(x_M,y_1) & \Pr(x_M,y_2) & \ldots & \Pr(x_M,y_K) \end{bmatrix}^T$
i.e. $\mathbf{J}$ is the $K \times M$ matrix with elements $J_{km} = \Pr(x_m, y_k)$.

17 A Joint DMS Source
$H_X \triangleq -\sum_{m=1}^{M} p_m \underbrace{\log_2(p_m)}_{\triangleq -I(x_m)} = -\mathbf{p}^T \log_2(\mathbf{p})$ bits/symbol   (21)
$H_Y \triangleq -\sum_{k=1}^{K} q_k \underbrace{\log_2(q_k)}_{\triangleq -I(y_k)} = -\mathbf{q}^T \log_2(\mathbf{q})$ bits/symbol   (22)
$H_{X \times Y} \triangleq -\sum_{m=1}^{M} \sum_{k=1}^{K} J_{km} \underbrace{\log_2(J_{km})}_{\triangleq -I(x_m, y_k)}$ bits/symbol   (23)
$= -\mathbf{1}_K^T \underbrace{(\mathbf{J} \odot \log_2(\mathbf{J}))}_{K \times M \text{ matrix}} \mathbf{1}_M$ bits/symbol   (24)
where $\mathbf{1}_M$ = a column vector of $M$ ones
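A minimal sketch of equations (21)-(24) for an assumed joint probability matrix (the entries of J below are illustrative):

```python
# Hedged sketch: marginal and joint entropies from an assumed K x M
# joint probability matrix J, with entries J_km = Pr(x_m, y_k).
import numpy as np

J = np.array([[0.15, 0.20, 0.15],      # row k = 1
              [0.25, 0.10, 0.15]])     # row k = 2; all entries sum to 1

p = J.sum(axis=0)                      # marginals Pr(x_m)
q = J.sum(axis=1)                      # marginals Pr(y_k)

def H(v):
    v = np.asarray(v, dtype=float).ravel()
    v = v[v > 0]                       # 0 * log2(0) taken as 0
    return -np.sum(v * np.log2(v))

H_X, H_Y, H_XY = H(p), H(q), H(J)
print(H_X, H_Y, H_XY)   # H_XY <= H_X + H_Y, with equality iff X, Y independent
```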

18 Markov Discrete Information Sources: Entropy
A DMS can be modelled with a single-state transition diagram. Markov sources are modelled by Markov state transition diagrams of $N$ states ($N > 1$), e.g. the state diagrams shown in the figure.

19 Markov Discrete Information Sources
The entropy of a Markov Source is given as follows:
$H = \sum_{i=1}^{N} \Pr(s_i) \cdot H_i$ bits/symbol   (25)
where
$N$ = number of states
$\Pr(s_i)$ = probability of being in the $i$-th state $s_i$
$H_i$ = entropy of the $i$-th state (considered as a DMS)
Example: Consider a two-state Markov source which gives the symbols A, B, C according to the following model (figure). Then (for you):
$H_1$ = ?
$H_2$ = ?
$H$ = ? bits/symbol
A sketch of this computation, for an assumed model, is given below.
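Since the slide's transition/emission model is given only as a figure, the sketch below assumes its own two-state model to show the mechanics of equation (25):

```python
# Hedged sketch of equation (25) for an assumed two-state Markov source
# emitting symbols A, B, C. T[i, j] = Pr(next state j | current state i).
import numpy as np

T = np.array([[0.8, 0.2],              # transitions out of state s1
              [0.4, 0.6]])             # transitions out of state s2

# Stationary state probabilities Pr(s_i): left eigenvector of T for
# eigenvalue 1, normalised to sum to 1.
w, v = np.linalg.eig(T.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()

# Assumed per-state symbol probabilities: each row is a DMS ensemble.
emit = np.array([[0.7, 0.2, 0.1],      # Pr(A), Pr(B), Pr(C) in state s1
                 [0.1, 0.3, 0.6]])     # Pr(A), Pr(B), Pr(C) in state s2

H_i = -np.sum(emit * np.log2(emit), axis=1)   # entropy of each state
H = np.dot(pi, H_i)                    # H = sum_i Pr(s_i) * H_i
print(f"H = {H:.3f} bits/symbol")
```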

20 Continuous Sources/Signals
A source whose output is a continuous signal in both amplitude and time is called a Continuous Source. However, this definition may be relaxed to include also signals that are continuous in amplitude but discrete in time, e.g. point A2. That is, the condition is relaxed to only "continuous in amplitude".
A continuous source which relates directly to analogue signal waveforms is described by its ensemble $[g(t), \mathrm{pdf}_g(g)]$, where $\mathrm{pdf}_g(g)$ denotes the amplitude probability density function of $g(t)$. However, in order to describe a continuous source $g(t)$ fully, the following parameters of $g(t)$ should be specified/identified in addition to the source ensemble:
- the power spectral density $\mathrm{PSD}_g(f)$,
- the bandwidth.

21 Measure of Information Generated by a Continuous Source: Differential Entropy
Entropy of an analogue source = Differential entropy:
$H_g \triangleq -\int \mathrm{pdf}_g(g) \cdot \log_2(\mathrm{pdf}_g(g))\, dg$   (26)
The term "entropy" in this course (for analogue signals) will be used to refer to "differential entropy".
Entropy of a Gaussian Source:
$H_g = \log_2 \sqrt{2 \pi e \sigma_g^2} = \max$   (27)
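A minimal numerical check of equation (27): integrate (26) for a Gaussian pdf on a grid (the variance below is an arbitrary choice) and compare with the closed form.

```python
# Hedged sketch: numerical differential entropy of a Gaussian pdf
# versus the closed-form log2(sqrt(2*pi*e*sigma^2)).
import numpy as np

sigma = 2.0
g = np.linspace(-10 * sigma, 10 * sigma, 200001)
dg = g[1] - g[0]
pdf = np.exp(-g**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

H_num = -np.sum(pdf * np.log2(pdf)) * dg          # -∫ pdf log2(pdf) dg
H_formula = np.log2(np.sqrt(2 * np.pi * np.e * sigma**2))
print(H_num, H_formula)                           # both ~3.047 bits
```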

22 Measure of Information Generated by a Continuous Source: Important Relationships
At the output of an information source $g(t)$ the following are very important:
Entropy: $H_g$   (28)
max(Entropy): Gaussian entropy $= \log_2 \sqrt{2 \pi e \sigma_g^2}$   (29)
Entropy Power: $N_g \triangleq \frac{1}{2 \pi e} 2^{2 H_g}$   (30)
Average Power: $P_g \triangleq \mathcal{E}\{g(t)^2\}$   (31)
In general: $P_g \ge N_g$   (32)
if $\mathrm{pdf}_g$ = Gaussian $\Rightarrow P_g = N_g$   (33)
if $\mathrm{pdf}_g \ne$ Gaussian $\Rightarrow P_g > N_g$   (34)
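A minimal sketch of (30)-(34), comparing entropy power with average power for a Gaussian and for a (non-Gaussian) uniform pdf; the variance and support width are arbitrary choices:

```python
# Hedged sketch: entropy power N_g vs average power P_g.
import numpy as np

# Gaussian, variance sigma^2: H_g = log2 sqrt(2*pi*e*sigma^2) => N_g = sigma^2
sigma2 = 4.0
H_gauss = 0.5 * np.log2(2 * np.pi * np.e * sigma2)
N_gauss = 2.0 ** (2 * H_gauss) / (2 * np.pi * np.e)
print(N_gauss, sigma2)                 # N_g = P_g = 4.0 for the Gaussian

# Zero-mean uniform on [-a, a]: H_g = log2(2a), P_g = a^2 / 3
a = 3.0
H_unif = np.log2(2 * a)
N_unif = 2.0 ** (2 * H_unif) / (2 * np.pi * np.e)  # = 4a^2 / (2*pi*e)
P_unif = a**2 / 3
print(N_unif, P_unif)                  # N_g < P_g for the non-Gaussian source
```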

23 A Note on Information Sinks
A communication sink is the destination of the symbols produced by a source and transmitted from the input to the output of a channel. It has one input and no output and can be seen as the inverse of a source. That is, matching its associated source, a sink is either
- continuous, or
- discrete.
Examples (with reference to the block structure of a comm system):
- continuous: from points Â, Â1, or t̂
- discrete: from points B̂, B̂1, or B̂2.
A sink is described by:
1 a performance (fidelity) criterion $\rho(X, Y)$
  - e.g. criterion for a discrete sink: BER
  - e.g. criterion for a continuous sink: SNR
2 a maximum allowed distortion $D_{\max}$ (e.g. $D_{\max} = 10^{-3}$, i.e. BER $< 10^{-3}$)
