Digital Communications: An Overview of Fundamentals


IMPERIAL COLLEGE of SCIENCE, TECHNOLOGY and MEDICINE
DEPARTMENT of ELECTRICAL and ELECTRONIC ENGINEERING

COMPACT LECTURE NOTES on COMMUNICATION THEORY
Prof. Athanassios Manikas, Spring 2001

Digital Communications: An Overview of Fundamentals

Aims:
- To provide the foundations of a typical Digital Communication System (DCS), in a block-diagram structure, and to give an overview of its operation.
- To define the main performance criteria for a DCS and the associated parameter planes, with emphasis on energy utilisation efficiency (EUE) and bandwidth utilisation efficiency (BUE).
- To identify the theoretical limits on the performance of a DCS.
- To highlight system trade-offs.

Introduction

Wireline and fibre communications, as well as wireless communications, are today fully digital. The general block structure of a Digital Communication System is shown below. It is common practice to express the quality of a DCS in terms of the accuracy with which the binary digits delivered at the output of the detector (point B2^) represent the binary digits fed into the digital modulator (point B2). The fraction of binary digits delivered back in error is generally taken as the measure of quality of the communication system. This fraction, or rate, is referred to as the bit error probability p_e, or Bit-Error-Rate (BER), at point B2^.

[Figure: General block structure of a Digital Communication System. Transmit path: continuous info source g(t) -> sampler (rate Fs; N.B. Fs > 2Fg) -> quantizer -> source encoder -> discrete channel encoder (interleaver + channel encoder) -> digital modulator (M, Tcs) -> analogue channel H(f) with additive noise n(t), signalling at r_cs = 1/Tcs bauds. Receive path: digital demodulator (M, Tcs) -> discrete channel decoder (de-interleaver + channel decoder) -> source decoder -> low-pass filter (-Fg, +Fg) -> continuous info sink ĝ(t). Annotated quantities: SNR_out = f(p_e), p_e = f(EUE), EUE, BUE, SNIR_in; the chain from modulator input to demodulator output forms the 'discrete channel'.]
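The BER at point B2^ can be estimated by direct counting: transmit a long known bit stream, detect it, and take the fraction of disagreements. A minimal illustrative sketch in Python (assuming NumPy; the channel function below is a hypothetical stand-in, here a simple binary symmetric channel, not the system of the notes):

    import numpy as np

    rng = np.random.default_rng(0)

    def channel(bits, flip_prob=0.01):
        # Hypothetical stand-in for modulator -> channel -> detector:
        # each bit is flipped independently with probability flip_prob.
        flips = rng.random(bits.shape) < flip_prob
        return bits ^ flips

    tx = rng.integers(0, 2, size=1_000_000)   # bits at point B2
    rx = channel(tx)                          # bits at point B2^
    ber = np.mean(tx != rx)                   # fraction delivered in error
    print(f"estimated BER = {ber:.4g}")       # approx. 0.01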

1. Information Sources

- Information sources (or communication sources, or simply sources) can be classified as:
  - Discrete, or
  - Continuous.
- The 'inverse' of an information/communication source is an information/communication sink (discrete or continuous).

N.B. (terminology): 'continuous' ≡ 'analogue'.

1.1. Continuous Sources/Signals (Point A1)

- A source whose output g(t) is a continuous signal in both amplitude and time is called a Continuous Source.
- A continuous source (which relates directly to analogue signal waveforms) is described by its ensemble (g(t), pdf_g(g)), where pdf_g(g) denotes the amplitude probability density function of g(t). However, in order to describe a continuous source g(t) fully, in addition to the source ensemble, the power spectral density PSD_g(f) of the signal at the source output has to be specified, and its bandwidth F_g should also be identified.
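For a recorded waveform, both descriptors can be estimated numerically: the amplitude pdf via a normalised histogram, and the PSD via a periodogram. An illustrative sketch (assuming NumPy; the test signal is an arbitrary stand-in for g(t), not a signal from the notes):

    import numpy as np

    Fs = 8000                                  # assumed sampling rate (Hz), Fs > 2*Fg
    t = np.arange(0, 1.0, 1 / Fs)
    rng = np.random.default_rng(1)
    g = np.sin(2 * np.pi * 300 * t) + 0.3 * rng.standard_normal(t.size)

    # Amplitude pdf estimate: normalised histogram of the sample values.
    pdf, edges = np.histogram(g, bins=100, density=True)

    # PSD estimate via the periodogram: |G(f)|^2 / (Fs * N).
    G = np.fft.rfft(g)
    psd = (np.abs(G) ** 2) / (Fs * g.size)
    f = np.fft.rfftfreq(g.size, 1 / Fs)
    print(f[np.argmax(psd)])                   # dominant frequency, ~300 Hz here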

1.2. Discrete Source (Point A2)

- A source which produces a sequence {X(n)} of symbols, one after another, with each symbol drawn from a finite alphabet X = {x_1, x_2, ..., x_M} at a rate r_X symbols/sec, and in which each symbol x_m ∈ X is produced at the output of the source with some associated probability Pr(x_m), abbreviated p_m (i.e. Pr(x_m) = p_m), is called a Discrete Source.
- If successive outputs from a discrete source are statistically independent, or in other words, if at each instant of time the source chooses for transmission one symbol from the set X = {x_1, x_2, ..., x_M} and its choices are independent from one time instant to the next, then the source is called a Discrete Memoryless Source (DMS).
- If p represents the vector whose elements are the probabilities associated with the symbols x_m ∈ X of a source, then the set

      (X, p) = {(x_1, p_1), (x_2, p_2), ..., (x_M, p_M)},   with   Σ_{m=1}^{M} p_m = 1,

  is defined as the Source Ensemble.
- A DMS can be fully described by its ensemble (X, p) and its symbol rate r_X (symbols per second).
- The point A2 may be considered as the input of a Digital Communication System, where messages consist of sequences of 'symbols' selected from an alphabet, e.g. the levels of a quantizer, or telegraph letters, numbers and punctuation.
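Because a DMS is fully described by (X, p) and r_X, simulating one amounts to independent draws from the ensemble, one per symbol interval. A minimal sketch (assuming NumPy; the alphabet, probabilities and rate are illustrative, not from the notes):

    import numpy as np

    # Source ensemble (X, p): alphabet and associated probabilities.
    X = np.array(["x1", "x2", "x3"])
    p = np.array([0.5, 0.3, 0.2])
    r_X = 1000                                    # symbols/sec

    rng = np.random.default_rng(2)
    n_symbols = 10 * r_X                          # 10 seconds of source output
    seq = rng.choice(X, size=n_symbols, p=p)      # memoryless: i.i.d. draws

    # Empirical symbol frequencies converge to p for a DMS.
    values, counts = np.unique(seq, return_counts=True)
    print(dict(zip(values, counts / n_symbols)))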

1.3. Measure of Information Generated by a Source: Source Entropy

- Consider a discrete memoryless information source with ensemble

      (X, p) = {(x_1, p_1), (x_2, p_2), ..., (x_M, p_M)},   with   Σ_{m=1}^{M} p_m = 1.

  Then the average information per symbol generated by the source is given by the so-called entropy of the source, i.e.

      H_X = H_X(p) = Σ_{m=1}^{M} p_m · I(x_m) = -Σ_{m=1}^{M} p_m · log₂(p_m)   bits/symbol

- H_X is a measure of the a priori uncertainty associated with the source output or, equivalently, a measure of the information obtained when the source output is observed.
- The notion of entropy is not restricted to the case where the ensemble is finite, i.e. M may be infinite. It can also be shown that H_X is the minimum number of binary digits (bits) needed to encode the source output.
- Based on entropy, the average information bit-rate at the output of the source is defined as follows:

      r_inf = r_X · H_X   bits/sec
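As an illustrative sketch (assuming Python with NumPy), the entropy of a given ensemble is a one-line computation:

    import numpy as np

    def entropy(p):
        # H_X = -sum_m p_m * log2(p_m) in bits/symbol, for a probability
        # vector p; terms with p_m = 0 contribute 0 (convention 0*log2 0 = 0).
        p = np.asarray(p, dtype=float)
        assert np.isclose(p.sum(), 1.0), "probabilities must sum to 1"
        nz = p > 0
        return float(-np.sum(p[nz] * np.log2(p[nz])))

    print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0: a uniform M-ary source attains log2(M)
    print(entropy([0.9, 0.1]))                 # 0.469: a skewed source carries less information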

- Thus we have two types of 'bits' and, therefore, two types of 'rates'. That is:
  - data rate = r_b, in data-bits/sec (or simply bits/sec);
  - info rate = r_inf, in info-bits/sec (or simply bits/sec).

  N.B.:
  - In general: r_inf ≤ r_b.
  - In ideal systems: r_inf = r_b.

- Example (a numerical check follows below). Consider a binary source emitting r_X = 10 symbols/sec.

  - If Pr(0) = 0.5 and Pr(1) = 0.5, then H_X = 1 bit/symbol, i.e. r_inf = r_b:

        data rate = r_b   = r_X × 1     = 10 bits/sec
        info rate = r_inf = r_X · H_X   = 10 bits/sec

  - If Pr(0) = 0.7 and Pr(1) = 0.3, then H_X = 0.8813 bits/symbol, i.e. r_inf < r_b:

        data rate = r_b   = r_X × 1     = 10 bits/sec
        info rate = r_inf = r_X · H_X   = 8.813 bits/sec
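A quick numerical check of both examples, as a minimal self-contained sketch (assuming NumPy; H2 is the binary entropy function):

    import numpy as np

    def H2(p0):
        # binary entropy function, in bits/symbol, for 0 < p0 < 1
        p = np.array([p0, 1 - p0])
        return float(-np.sum(p * np.log2(p)))

    r_X = 10  # symbols/sec; binary source, so 1 data-bit per symbol

    for p0 in (0.5, 0.7):
        H = H2(p0)          # bits/symbol
        r_b = r_X * 1       # data rate (bits/sec)
        r_inf = r_X * H     # info rate (bits/sec)
        print(f"Pr(0)={p0}: H_X={H:.4f}, r_b={r_b}, r_inf={r_inf:.3f}")

    # Pr(0)=0.5: H_X=1.0000, r_b=10, r_inf=10.000
    # Pr(0)=0.7: H_X=0.8813, r_b=10, r_inf=8.813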

1.4. A 'Joint' Source

- Consider two information sources with the ensembles (X, p) and (Y, q), respectively, defined as follows:

      (X, p) = {(x_1, p_1), (x_2, p_2), ..., (x_M, p_M)} = {x_m, Pr(x_m) = p_m, ∀m: 1 ≤ m ≤ M}
      (Y, q) = {(y_1, q_1), (y_2, q_2), ..., (y_K, q_K)} = {y_k, Pr(y_k) = q_k, ∀k: 1 ≤ k ≤ K}

- Let us form a joint source whose symbols are taken to be pairs of symbols drawn from the two original sources X and Y. The new source, with alphabet {(x_1, y_1), (x_1, y_2), ..., (x_M, y_K)}, is known as a joint source, and its ensemble as a joint ensemble, defined as follows:

      (X×Y, J) = {(x_m, y_k), Pr(x_m, y_k) = J_mk, ∀m, k: 1 ≤ m ≤ M, 1 ≤ k ≤ K}

- Note the joint probabilistic relationship between the symbols of X and Y, described by the so-called joint probability matrix:

      J = [ Pr(x_1, y_1)   Pr(x_1, y_2)   ...   Pr(x_1, y_K)
            Pr(x_2, y_1)   Pr(x_2, y_2)   ...   Pr(x_2, y_K)
            ...            ...            ...   ...
            Pr(x_M, y_1)   Pr(x_M, y_2)   ...   Pr(x_M, y_K) ]

- The entropies associated with the above three sources X, Y and X×Y are defined as follows:

      H_X  = -Σ_{m=1}^{M} p_m · log₂(p_m)                          bits/symbol
      H_Y  = -Σ_{k=1}^{K} q_k · log₂(q_k)                          bits/symbol
      H_XY = -Σ_{m=1}^{M} Σ_{k=1}^{K} J_mk · log₂(J_mk)
           = -1ᵀ · (J ⊙ log₂ J) · 1                                bits/symbol

  where

      J = [ J_11  J_12  ...  J_1K
            J_21  J_22  ...  J_2K
            ...   ...   ...  ...
            J_M1  J_M2  ...  J_MK ]

  is the joint probability matrix, ⊙ denotes element-wise multiplication, and 1 = [1, 1, ..., 1]ᵀ denotes a vector of ones (of length M on the left of the product and length K on the right).
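Given the joint probability matrix J, all three entropies follow from its entries and its marginals. An illustrative sketch (assuming NumPy; the 2×3 matrix below is a hypothetical example, not from the notes):

    import numpy as np

    # Hypothetical joint probability matrix J (rows: x_m, columns: y_k);
    # all entries are positive here and sum to 1.
    J = np.array([[0.10, 0.20, 0.10],
                  [0.30, 0.15, 0.15]])

    p = J.sum(axis=1)    # marginal pmf of X (row sums)
    q = J.sum(axis=0)    # marginal pmf of Y (column sums)

    H_X  = -np.sum(p * np.log2(p))
    H_Y  = -np.sum(q * np.log2(q))
    H_XY = -np.sum(J * np.log2(J))    # element-wise, i.e. -1'·(J ⊙ log2 J)·1

    print(H_X, H_Y, H_XY)             # note H_XY <= H_X + H_Y, with equality
                                      # iff X and Y are independent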

2. A Note on Information Sinks

- A communication sink is the destination of the symbols produced by a source and transmitted from the input to the output of a channel. It has one input and no output, and it can be seen as the inverse of a source. It is described by:
  1. a performance (fidelity) criterion defined on the pair (X, Y) (e.g. criterion: BER);
  2. a maximum allowed distortion D_max (e.g. D_max = 10⁻³, i.e. BER < 10⁻³).

5) Markov Information Sources:

Markov sources are modelled by Markov state-transition diagrams, and their entropy is given as follows:

    H = Σ_{i=1}^{N} π_i · H_i   bits/symbol,   N = number of states

where

    π_i = probability of being in the i-th state
    H_i = entropy of the i-th state (considered as a DMS)

Example: Consider a two-state Markov source which emits the symbols A, B, C according to the following model:

[Figure: two-state Markov transition diagram, states 1 and 2, with branches labelled (symbol, probability): (A, 3/4) and (C, 1/4) leaving state 1, and (B, 3/4) and (C, 1/4) leaving state 2.]

Then

    H_1 = ? (for you)
    H_2 = ?
    H = Σ_{i=1}^{2} π_i · H_i bits/symbol = ?
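The formula can be checked numerically. The sketch below (assuming NumPy, and assuming the reading of the diagram in which the (C, 1/4) branches switch state while the (A, 3/4) and (B, 3/4) branches stay) computes the stationary probabilities π and the source entropy H for the two-state example:

    import numpy as np

    # State-transition matrix, P[i, j] = Pr(next state = j | current state = i),
    # under the assumed reading of the diagram. Each state emits one distinct
    # symbol per branch, so its emission pmf equals its row of P here.
    P = np.array([[0.75, 0.25],
                  [0.25, 0.75]])

    # Stationary probabilities pi solve pi = pi·P with sum(pi) = 1:
    # the left eigenvector of P for eigenvalue 1, normalised.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi = pi / pi.sum()                        # [0.5, 0.5] by symmetry here

    H_i = -np.sum(P * np.log2(P), axis=1)     # per-state entropies (each state as a DMS)
    H = float(pi @ H_i)                       # H = sum_i pi_i · H_i
    print(pi, H_i, H)                         # H = 0.8113 bits/symbol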

