One Lesson of Information Theory


One Lesson of Information Theory
Prof. Dr.-Ing. Volker Kühn
Institute of Communications Engineering, University of Rostock, Germany
September 2010

Outline of Lectures

Lesson 1: One Lesson of Information Theory
- Basic structure of communication systems
- Definitions of entropy and mutual information
- Shannon's channel coding theorem

Lesson 2: Introduction to Error Correcting Codes
- Basics of error correcting codes
- Linear block codes
- Convolutional codes (if time permits)

Lesson 3: State-of-the-Art Channel Coding
- Coding strategies to approach the capacity limits
- Definition of soft information and the turbo decoding principle
- Examples of state-of-the-art error correcting codes

Literature
- Lin, Costello: Error Control Coding: Fundamentals and Applications
- Bossert: Channel Coding
- Johannesson, Zigangirov: Fundamentals of Convolutional Coding
- Richardson, Urbanke: Modern Coding Theory
- Neubauer, Freudenberger, Kühn: Coding Theory: Algorithms, Architectures, and Applications
- Johannesson: Information Theory
- Cover, Thomas: Elements of Information Theory


Basic Structure of a Digital Communication System (1)

[Block diagram: analog source -> source encoder; both together form the digital source, delivering u of length k]

- The source generates an analog signal (e.g. voice, video).
- Source coding samples, quantizes, and compresses the analog signal.
- Digital source: comprises the analog source and the source encoder; it delivers the digital data vector u of length k.

Basic Structure of a Digital Communication System (2)

[Block diagram: digital source -> channel encoder; u (length k) -> x (length n)]

- The channel encoder adds redundancy to u, resulting in the code word x of length n.
- The channel encoder may consist of several constituent codes.
- Code rate: R_c = k/n.

Basic Structure of a Digital Communication System (3)

[Block diagram: u -> channel encoder (R_c = k/n) -> x -> modulator -> physical channel -> demodulator -> y; modulator, physical channel, and demodulator together form the time-discrete channel]

- The modulator maps the discrete vector x onto an analog waveform and shifts it into the transmission band.
- The physical channel represents the transmission medium:
  - multipath propagation -> intersymbol interference (ISI)
  - time-varying fading, i.e. deep fades
  - additive noise
- The demodulator shifts the signal back into the baseband and performs lowpass filtering, sampling, and quantization.
- Time-discrete channel: comprises the analog part of the modulator, the physical channel, and the analog part of the demodulator.

Basic Structure of a Digital Communication System (4)

[Block diagram: transmit chain as before; the channel decoder estimates u from the received vector y]

- Channel decoder: estimates u on the basis of the received vector y.
- y need not consist of hard-quantized values {0, 1}.
- Since the encoder may consist of several parts, the decoder may also consist of several modules.

Basic Structure of a Digital Communication System (5)

[Block diagram of the complete chain: digital source -> channel encoder (R_c = k/n) -> modulator -> physical channel -> demodulator -> channel decoder (û) -> source decoder -> digital sink]

Quotation from Massey: "The purpose of the modulation system is to create a good discrete channel from the modulator input to the demodulator output, and the purpose of the coding system is to transmit the information bits reliably through this discrete channel at the highest practicable rate."

Time-Discrete Channel

The time-discrete channel comprises the analog parts of the modulator and demodulator as well as the physical transmission medium.

- Discrete input alphabet: x_i ∈ X = {X_0, ..., X_{|X|-1}}
- Discrete or continuous output alphabet: y_i ∈ Y = {Y_0, ..., Y_{|Y|-1}} or Y = R

Probabilities and probability densities (discrete / continuous output):
- Symbol probabilities: Pr{X_ν}, Pr{Y_μ} / p(y)
- Joint probabilities: Pr{X_ν, Y_μ} / p(x = X_ν, y)
- Conditional probabilities: Pr{Y_μ | X_ν} / p(y | x = X_ν)
- A posteriori probabilities: Pr{X_ν | Y_μ} / Pr{X_ν | y}

AWGN: Additive White Gaussian Noise

Channel model: y_i = x_i + n_i with Gaussian noise n_i, i.e.

$$p(y \mid x = X_\nu) = \frac{1}{\sqrt{2\pi\sigma_N^2}}\, e^{-\frac{(y - X_\nu)^2}{2\sigma_N^2}}$$

[Figure: conditional densities p(y | x = -1), p(y | x = +1) and marginal density p(y) for signal-to-noise ratios E_s/N_0 = 2 dB and 6 dB]

Error Probability of AWGN Channel and BPSK

[Figure: decision regions Y_0, Y_1 with decision threshold at 0 for X_0 = -1 and X_1 = +1]

Symbol error probability, expressed via the complementary error function:

$$P_s = \frac{1}{\sqrt{\pi}} \int_{\sqrt{E_s/N_0}}^{\infty} e^{-\xi^2}\, d\xi = \frac{1}{2}\,\mathrm{erfc}\!\left(\sqrt{\frac{E_s}{N_0}}\right)$$
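
As a numerical cross-check of the formula above, P_s can be evaluated directly with the standard-library erfc; a minimal Python sketch (the helper name bpsk_error_prob is ours, not from the slides):

```python
import math

def bpsk_error_prob(es_n0_db: float) -> float:
    """Symbol error probability of BPSK on the AWGN channel,
    P_s = 0.5 * erfc(sqrt(Es/N0))."""
    es_n0 = 10 ** (es_n0_db / 10)          # dB -> linear
    return 0.5 * math.erfc(math.sqrt(es_n0))

for db in (0, 2, 4, 6, 8):
    print(f"Es/N0 = {db} dB: Ps = {bpsk_error_prob(db):.3e}")
```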

Transition to Discrete Channels

Discrete channels arise from quantization of the continuous channel output.

- We consider binary antipodal transmission: X = {X_0, X_1} = {+1, -1}.
- The channel output is generally continuously distributed: Y = R.
- L-bit quantization, due to the finite precision of digital circuits, delivers the alphabet Y = {Y_0, ..., Y_{2^L - 1}}:
  - L = 1 (hard decision): Y = {Y_0, Y_1} = {+1, -1} = X
  - L = 2: four output symbols, Y = {Y_0, Y_1, Y_2, Y_3}
  - L = 3: eight output symbols, Y = {Y_0, ..., Y_7}

Discrete Channels (1)

Binary Symmetric Channel (BSC), obtained by hard decision (L = 1): Y = {Y_0, Y_1} = {+1, -1}

[Transition diagram: X_0 -> Y_0 and X_1 -> Y_1 with probability 1 - P_e; X_0 -> Y_1 and X_1 -> Y_0 with probability P_e]

$$P_e = \frac{1}{2}\,\mathrm{erfc}\!\left(\sqrt{\frac{E_s}{N_0}}\right)$$

Discrete Channels (2)

Binary Symmetric Erasure Channel (BSEC), obtained by quantization with an erasure zone [-a, +a]; Y_2 is the erasure symbol.

[Transition diagram: X_ν -> correct symbol with probability 1 - P_e - P_q, -> erasure Y_2 with probability P_q, -> wrong symbol with probability P_e; quantizer thresholds at -a and +a for X_0 = -1, X_1 = +1]

Discrete Channels (3)

2-bit quantization with thresholds -a, 0, +a delivers four output symbols Y_0, Y_1, Y_2, Y_3.

[Transition diagram: each input X_0, X_1 is connected to all four output symbols Y_0, ..., Y_3]


Information, Entropy

The amount of information of an event should depend on its probability: I(X_ν) = f(Pr{X_ν}).

For independent events, Pr{X_ν, Y_μ} = Pr{X_ν} · Pr{Y_μ} shall imply I(X_ν, Y_μ) = I(X_ν) + I(Y_μ). The logarithm is the sole function that maps a product onto a sum:

$$I(X_\nu) = -\log_2(\Pr\{X_\nu\}) = \log_2\left(\frac{1}{\Pr\{X_\nu\}}\right) \geq 0$$

Entropy:

$$H(X) = -\sum_\nu \Pr\{X_\nu\} \log_2(\Pr\{X_\nu\}) = \mathrm{E}\big\{-\log_2(\Pr\{X\})\big\}$$

Entropy is a measure of uncertainty.

Examples for Entropy

Set of events X = {X_1, X_2, X_3, X_4, X_5}, each occurring with a certain probability:

- Pr{X_1} = 0.30 -> I(X_1) = 1.74 bit
- Pr{X_2} = 0.20 -> I(X_2) = 2.32 bit
- Pr{X_3} = 0.20 -> I(X_3) = 2.32 bit
- Pr{X_4} = 0.15 -> I(X_4) = 2.74 bit
- Pr{X_5} = 0.15 -> I(X_5) = 2.74 bit

$$H(X) = -\sum_\nu \Pr\{X_\nu\} \log_2(\Pr\{X_\nu\}) = 2.27 \text{ bit}$$

The entropy of a set is maximized when all M elements are equally likely:

$$\max H(X) = H_{\text{equal}}(X) = \sum_{\nu=0}^{M-1} \frac{1}{M} \log_2(M) = \log_2(M) = \log_2(5) = 2.32 \text{ bit}$$
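
The numbers in this example are easy to reproduce; a minimal Python sketch (the function name entropy is ours, not from the slides):

```python
import math

def entropy(probs):
    """H(X) = -sum p*log2(p); terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [0.30, 0.20, 0.20, 0.15, 0.15]
for i, p in enumerate(probs, 1):
    print(f"I(X_{i}) = {-math.log2(p):.2f} bit")
print(f"H(X)   = {entropy(probs):.2f} bit")          # ~2.27 bit
print(f"max    = {math.log2(len(probs)):.2f} bit")   # log2(5) ~ 2.32 bit
```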

Example: LCD for 10 Digits

A seven-segment LCD represents each digit by m = 7 binary segments a, ..., g.

[Table: 7-segment patterns (segments a-g) for the digits 0-9]

- All digits occur with the same probability: Pr{X_ν} = 0.1
- Amount of information per digit: I(X_ν) = -log_2(Pr{X_ν}) = log_2(10) = 3.32 bit
- Entropy of the alphabet: H(X) = Σ_ν Pr{X_ν} · I(X_ν) = 3.32 bit
- Absolute redundancy: R = m - H(X) = 7 bit - 3.32 bit = 3.68 bit
- Relative redundancy: r = R/m = 3.68 bit / 7 bit = 52.54%
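
The redundancy figures follow from three lines of arithmetic; a sketch assuming, as on the slide, m = 7 binary segments per digit:

```python
import math

m = 7                      # segments (bits) used per digit
H = math.log2(10)          # entropy of 10 equally likely digits, ~3.32 bit
R = m - H                  # absolute redundancy, ~3.68 bit
print(f"H = {H:.2f} bit, R = {R:.2f} bit, r = {R / m:.2%}")
```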

Binary Entropy Function

Set of events X = {X_1, X_2} with probabilities Pr{X_1} = P_1 and Pr{X_2} = 1 - P_1:

$$H(X) = H_2(P_1) = -P_1 \log_2(P_1) - (1 - P_1) \log_2(1 - P_1)$$

[Figure: H_2(P_1) over P_1 ∈ [0, 1]; maximum of 1 bit at P_1 = 0.5]
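
A small sketch of the binary entropy function, e.g. to reproduce the plotted curve (the helper name h2 is ours):

```python
import math

def h2(p: float) -> float:
    """Binary entropy function H2(p) in bit, with 0*log2(0) := 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(f"H2({p}) = {h2(p):.3f} bit")   # maximum 1 bit at p = 0.5
```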

Illumination of Entropies

[Venn diagram relating H(X), H(Y), H(X|Y), H(Y|X), H(X;Y), and H(X,Y)]

- H(X), H(Y): entropies of the source and sink alphabets
- H(X, Y): joint entropy of source and sink
- H(X|Y): equivocation, the information lost during transmission
- H(Y|X): irrelevance, information not originating from the source
- H(X; Y): mutual information, the information correctly received at the sink

Joint Entropy, Equivocation, Irrelevance

Joint information:

$$I(X_\nu, Y_\mu) = -\log_2 \Pr\{X_\nu, Y_\mu\}$$

Joint entropy of source and sink:

$$H(X, Y) = -\sum_\nu \sum_\mu \Pr\{X_\nu, Y_\mu\} \log_2 \Pr\{X_\nu, Y_\mu\} = \mathrm{E}\big\{-\log_2 \Pr\{X_\nu, Y_\mu\}\big\}$$

Equivocation (information lost during transmission):

$$H(X \mid Y) = H(X, Y) - H(Y) = -\sum_\nu \sum_\mu \Pr\{X_\nu, Y_\mu\} \log_2 \Pr\{X_\nu \mid Y_\mu\} = \mathrm{E}\big\{-\log_2 \Pr\{X_\nu \mid Y_\mu\}\big\}$$

Irrelevance:

$$H(Y \mid X) = H(X, Y) - H(X) = -\sum_\nu \sum_\mu \Pr\{X_\nu, Y_\mu\} \log_2 \Pr\{Y_\mu \mid X_\nu\} = \mathrm{E}\big\{-\log_2 \Pr\{Y_\mu \mid X_\nu\}\big\}$$

Mutual Information

Definition of mutual information:

$$H(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X, Y)$$

$$H(X; Y) = \sum_\nu \sum_\mu \Pr\{Y_\mu \mid X_\nu\} \Pr\{X_\nu\} \log_2\frac{\Pr\{Y_\mu \mid X_\nu\}}{\Pr\{Y_\mu\}} = \mathrm{E}\left\{\log_2\frac{\Pr\{Y_\mu \mid X_\nu\}}{\Pr\{Y_\mu\}}\right\} = \mathrm{E}\left\{\log_2\frac{\Pr\{X_\nu, Y_\mu\}}{\Pr\{X_\nu\}\Pr\{Y_\mu\}}\right\}$$

- Mutual information is the amount of information common to X and Y.
- Mutual information is the reduction of uncertainty in X due to the knowledge of Y.
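
All of these quantities can be computed from a joint probability table. The sketch below checks the three equivalent forms of H(X;Y) on a small example; the joint pmf is an arbitrary illustrative choice, not taken from the slides:

```python
import math

def H(probs):
    """Entropy in bit of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# joint pmf Pr{X_nu, Y_mu}: rows = X, columns = Y (illustrative values)
pxy = [[0.40, 0.10],
       [0.05, 0.45]]

px = [sum(row) for row in pxy]                   # marginal Pr{X_nu}
py = [sum(col) for col in zip(*pxy)]             # marginal Pr{Y_mu}

H_xy = H(p for row in pxy for p in row)          # joint entropy H(X,Y)
equivocation = H_xy - H(py)                      # H(X|Y) = H(X,Y) - H(Y)
irrelevance  = H_xy - H(px)                      # H(Y|X) = H(X,Y) - H(X)

mi1 = H(px) - equivocation                       # H(X) - H(X|Y)
mi2 = H(py) - irrelevance                        # H(Y) - H(Y|X)
mi3 = H(px) + H(py) - H_xy                       # H(X) + H(Y) - H(X,Y)
print(f"H(X;Y) = {mi1:.4f} = {mi2:.4f} = {mi3:.4f} bit")
```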

Illustration of Channel Capacity

[Diagram: H(X) splits into the equivocation H(X|Y) and the mutual information H(X;Y); H(Y) consists of H(X;Y) and the irrelevance H(Y|X)]

Maximizing the mutual information with respect to the source statistics delivers the channel capacity:

$$C = \sup_{\Pr\{X\}} \sum_\nu \sum_\mu \Pr\{Y_\mu \mid X_\nu\} \Pr\{X_\nu\} \log_2\frac{\Pr\{Y_\mu \mid X_\nu\}}{\Pr\{Y_\mu\}}$$
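
For a binary-input channel, the supremum over the input distribution can be approximated by a simple grid search over Pr{X_0}; a sketch only (a general solver would rather use the Blahut-Arimoto algorithm):

```python
import math

def mutual_information(p0, trans):
    """I(X;Y) in bit for input distribution (p0, 1-p0) and
    transition matrix trans[nu][mu] = Pr{Y_mu | X_nu}."""
    px = [p0, 1 - p0]
    n_out = len(trans[0])
    py = [sum(px[nu] * trans[nu][mu] for nu in range(2))
          for mu in range(n_out)]
    return sum(px[nu] * trans[nu][mu] * math.log2(trans[nu][mu] / py[mu])
               for nu in range(2) for mu in range(n_out)
               if trans[nu][mu] > 0 and py[mu] > 0)

# example: BSC with Pe = 0.1; the maximum is attained at p0 = 0.5
bsc = [[0.9, 0.1],
       [0.1, 0.9]]
C = max(mutual_information(i / 1000, bsc) for i in range(1, 1000))
print(f"C ~ {C:.4f} bit/use")     # 1 - H2(0.1) ~ 0.5310
```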


Channel Coding Theorem of Shannon

Shannon, 1948: "A Mathematical Theory of Communication"

- If a channel has capacity C, there exists a code of rate R_c ≤ C for which the probability of a decoding error can be made arbitrarily small.
- Converse theorem: If a channel has capacity C, reliable (error-free) communication cannot be achieved with codes of rate R_c > C.
- The theorems are not constructive, i.e. they do not provide a construction guideline for powerful codes.

Capacity of Binary Channels (1): Perfect Channel

[Transition diagram: X_0 -> Y_0 and X_1 -> Y_1 with probability 1]

Statistics of the channel:

$$\Pr\{X_\nu\} = \begin{cases} P_0 & \nu = 0 \\ 1 - P_0 & \nu = 1 \end{cases} \qquad \Pr\{Y_\mu \mid X_\nu\} = \begin{cases} 1 & \mu = \nu \\ 0 & \mu \neq \nu \end{cases} \qquad \Pr\{Y_\mu\} = \begin{cases} P_0 & \mu = 0 \\ 1 - P_0 & \mu = 1 \end{cases}$$

Mutual information (hint: 0 · log_2(0) = 0):

$$H(X; Y) = P_0 \log_2\frac{1}{P_0} + (1 - P_0) \log_2\frac{1}{1 - P_0} = H_2(P_0) = H(X)$$

Perfect transmission without any errors!

Capacity of Binary Channels (2): Inverting Channel

[Transition diagram: X_0 -> Y_1 and X_1 -> Y_0 with probability 1]

Statistics of the channel:

$$\Pr\{X_\nu\} = \begin{cases} P_0 & \nu = 0 \\ 1 - P_0 & \nu = 1 \end{cases} \qquad \Pr\{Y_\mu \mid X_\nu\} = \begin{cases} 0 & \mu = \nu \\ 1 & \mu \neq \nu \end{cases} \qquad \Pr\{Y_\mu\} = \begin{cases} 1 - P_0 & \mu = 0 \\ P_0 & \mu = 1 \end{cases}$$

Mutual information (hint: 0 · log_2(0) = 0):

$$H(X; Y) = P_0 \log_2\frac{1}{P_0} + (1 - P_0) \log_2\frac{1}{1 - P_0} = H_2(P_0) = H(X)$$

Again perfect transmission without any errors: the deterministic inversion can simply be undone at the receiver.

Capacity of Binary Erasure Channel

[Transition diagram of the BEC: X_ν -> Y_ν with probability 1 - P_e, X_ν -> erasure Y_2 with probability P_e]

Statistics of the BEC:

$$\Pr\{X_\nu\} = \begin{cases} P_0 & \nu = 0 \\ 1 - P_0 & \nu = 1 \end{cases} \qquad \Pr\{Y_\mu\} = \begin{cases} P_0 (1 - P_e) & \mu = 0 \\ (1 - P_0)(1 - P_e) & \mu = 1 \\ P_e (P_0 + 1 - P_0) = P_e & \mu = 2 \end{cases}$$

Mutual information of the BEC:

$$I(X; Y) = (1 - P_e) P_0 \log_2\frac{1 - P_e}{P_0 (1 - P_e)} + P_e P_0 \log_2\frac{P_e}{P_e} + (1 - P_e)(1 - P_0) \log_2\frac{1 - P_e}{(1 - P_0)(1 - P_e)} + P_e (1 - P_0) \log_2\frac{P_e}{P_e} = (1 - P_e)\, H_2(P_0)$$
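
A quick numerical check that the brute-force sum over the three output symbols indeed collapses to (1 - P_e) · H_2(P_0); a sketch with arbitrarily chosen P_0 and P_e:

```python
import math

def h2(p):
    return 0.0 if p in (0.0, 1.0) else -p*math.log2(p) - (1-p)*math.log2(1-p)

def bec_mutual_information(p0, pe):
    """I(X;Y) of the binary erasure channel, summed over Y = {Y0, Y1, Y2}."""
    px = [p0, 1 - p0]
    # trans[nu][mu] = Pr{Y_mu | X_nu}; Y2 is the erasure symbol
    trans = [[1 - pe, 0.0, pe],
             [0.0, 1 - pe, pe]]
    py = [sum(px[nu] * trans[nu][mu] for nu in range(2)) for mu in range(3)]
    return sum(px[nu] * trans[nu][mu] * math.log2(trans[nu][mu] / py[mu])
               for nu in range(2) for mu in range(3)
               if trans[nu][mu] > 0 and py[mu] > 0)

p0, pe = 0.3, 0.2
print(bec_mutual_information(p0, pe))        # equals ...
print((1 - pe) * h2(p0))                     # ... (1 - Pe) * H2(P0)
```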

Capacity of Binary Erasure Channel (cont'd)

[Figure: mutual information of the BEC versus P_e for input distributions Pr(X_0) = 0.1, 0.2, 0.3, 0.4, 0.5]

Capacity of the BEC, attained for the uniform input distribution:

$$C_{\text{BEC}} = 1 - P_e$$

Capacity of Binary Symmetric Channel

Statistics of the BSC for the uniform input distribution:

$$\Pr\{X_0\} = \Pr\{X_1\} = \frac{1}{2} \qquad \Pr\{Y_\mu \mid X_\nu\} = \begin{cases} 1 - P_e & \mu = \nu \\ P_e & \mu \neq \nu \end{cases} \qquad \Pr\{Y_0\} = \Pr\{Y_1\} = \frac{1}{2}$$

Mutual information of the BSC:

$$C_{\text{BSC}} = 2 (1 - P_e) \frac{1}{2} \log_2\big(2(1 - P_e)\big) + 2 P_e \frac{1}{2} \log_2(2 P_e) = (1 - P_e)\big(1 + \log_2(1 - P_e)\big) + P_e \big(1 + \log_2(P_e)\big) = 1 + (1 - P_e)\log_2(1 - P_e) + P_e \log_2(P_e) = 1 - H_2(P_e)$$
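
Combined with the hard-decision error probability P_e = ½ erfc(√(E_s/N_0)) from the BSC slide above, the capacity can be tabulated over the SNR; a minimal sketch:

```python
import math

def h2(p):
    return 0.0 if p in (0.0, 1.0) else -p*math.log2(p) - (1-p)*math.log2(1-p)

def c_bsc(es_n0_db):
    """Capacity of the BSC obtained by hard decision of BPSK on AWGN."""
    pe = 0.5 * math.erfc(math.sqrt(10 ** (es_n0_db / 10)))
    return 1 - h2(pe)

for db in (-2, 0, 2, 4, 6):
    print(f"Es/N0 = {db:>2} dB: C_BSC = {c_bsc(db):.4f} bit/use")
```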

Capacity of Binary Symmetric Channel (cont'd)

[Figure: mutual information of the BSC versus P_e for input distributions Pr{X_0} = 0.1, 0.3, 0.5]

Capacity of the BSC for the uniform input distribution:

$$C_{\text{BSC}} = 1 + P_e \log_2(P_e) + (1 - P_e) \log_2(1 - P_e) = 1 - H_2(P_e)$$

Binary Symmetric Erasure Channel (BSEC)

- The quantization parameter a has to be optimized with respect to the channel capacity C.
- The optimal choice depends on the signal-to-noise ratio E_s/N_0.

[Transition diagram: X_ν -> correct symbol with probability 1 - P_e - P_q, -> erasure Y_2 with probability P_q, -> wrong symbol with probability P_e; quantizer thresholds at -a and +a for X_0 = -1, X_1 = +1]

$$C_{\text{BSEC}} = 1 - P_q + P_e \log_2(P_e) + (1 - P_e - P_q) \log_2(1 - P_e - P_q) - (1 - P_q) \log_2(1 - P_q)$$
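
Combining this formula with Gaussian transition probabilities lets one search for the optimum threshold numerically. The sketch below assumes unit-energy antipodal signalling (x = ±1) and noise variance σ² = N_0/(2E_s), so that P_e = Q((1+a)/σ) and P_q = Q((1-a)/σ) - Q((1+a)/σ); these modelling details are our assumptions, not spelled out on the slide:

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x) = 0.5*erfc(x/sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def xlog2(p):
    """p*log2(p) with the convention 0*log2(0) = 0."""
    return p * math.log2(p) if p > 0 else 0.0

def c_bsec(es_n0_db, a):
    """Capacity of the BSEC for erasure threshold a (unit-energy BPSK)."""
    sigma = math.sqrt(1 / (2 * 10 ** (es_n0_db / 10)))  # sigma^2 = N0/(2*Es)
    pe = q_func((1 + a) / sigma)                # sent symbol crosses -a
    pq = q_func((1 - a) / sigma) - pe           # sent symbol lands in [-a, a]
    return 1 - pq + xlog2(pe) + xlog2(1 - pe - pq) - xlog2(1 - pq)

# crude grid search for the optimum threshold at Es/N0 = 2 dB
c_opt, a_opt = max((c_bsec(2.0, a / 100), a / 100) for a in range(100))
print(f"C_BSEC = {c_opt:.4f} bit/use at a = {a_opt:.2f}")
```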

Channel Capacity for BSC and BSEC

[Figure: C_BSEC with optimized threshold a and C_BSC versus E_s/N_0 in dB; optimum threshold a_opt versus E_s/N_0]

Choosing a > 1 leads only to a minor improvement of the channel capacity.

Capacity of AWGN Channel

Additive white Gaussian noise channel: y = x + n with n ~ N(0, σ_N²), x ~ N(0, σ_X²), y ~ N(0, σ_Y²).

Differential entropy of a Gaussian random process:

$$h(X) = -\int p_X(\xi) \log_2 p_X(\xi)\, d\xi = \frac{1}{2} \log_2(2\pi e \sigma_X^2)$$

Capacity of the AWGN channel:

$$C = h(Y) - h(Y \mid X) = h(Y) - h(N) = \frac{1}{2} \log_2\big(2\pi e (\sigma_X^2 + \sigma_N^2)\big) - \frac{1}{2} \log_2(2\pi e \sigma_N^2) = \frac{1}{2} \log_2\left(1 + \frac{\sigma_X^2}{\sigma_N^2}\right)$$
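
Evaluating the closed form for a few SNR values (with SNR = σ_X²/σ_N²); a minimal sketch:

```python
import math

def c_awgn(snr_db):
    """Capacity of the real-valued AWGN channel, 0.5*log2(1 + SNR)."""
    return 0.5 * math.log2(1 + 10 ** (snr_db / 10))

for db in (0, 10, 20, 30):
    print(f"SNR = {db} dB: C = {c_awgn(db):.3f} bit/use")
```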

Channel Capacity of BPSK and AWGN

[Figure: capacity of BPSK on the AWGN channel versus E_s/N_0 in dB, showing the influence of output quantization with q = 1, 2, 3 bits compared to the continuous (Gaussian) output]

Ultimate Communication Limit

Energy per information bit: E_b = E_s / C, i.e. E_s = C · E_b.

Capacity of the 1-D AWGN channel:

$$C = \frac{1}{2} \log_2\left(1 + 2\,\frac{E_s}{N_0}\right) = \frac{1}{2} \log_2\left(1 + 2 C\, \frac{E_b}{N_0}\right)$$

Minimum signal-to-noise ratio:

$$\frac{E_b}{N_0} = \frac{2^{2C} - 1}{2C} \;\xrightarrow{\;C \to 0\;}\; \ln(2) \;\hat{=}\; -1.59 \text{ dB}$$

[Figure: C versus E_b/N_0 in dB; the curve approaches the ultimate limit of -1.59 dB as C -> 0]
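
The bound is easy to tabulate; a minimal sketch showing the approach to the -1.59 dB limit as the rate decreases:

```python
import math

def eb_n0_min_db(c):
    """Minimum Eb/N0 (in dB) for rate C on the 1-D AWGN channel."""
    return 10 * math.log10((2 ** (2 * c) - 1) / (2 * c))

for c in (1.0, 0.5, 0.1, 0.01, 0.001):
    print(f"C = {c:5.3f}: Eb/N0 >= {eb_n0_min_db(c):6.2f} dB")
# as C -> 0 the bound approaches ln(2), i.e. -1.59 dB
```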

Thanks for your attention!

Prof. Dr.-Ing. Volker Kühn, Institute of Communications Engineering, University of Rostock, September 2010
