Channel Coding 1. Sportturm (SpT), Room: C3165


1 Channel Coding
Dr.-Ing. Dirk Wübben, Institute for Telecommunications and High-Frequency Techniques, Department of Communications Engineering; Sportturm (SpT), Room: C3165.
Lecture: Monday, 8:30 – 10:00 in S7.
Exercise: Wednesday, 15:00 – 17:00 in N5; dates for the exercises will be announced during the lectures.
Tutor: Shayan Hassanpour, Sportturm (SpT), e-mail: hassanpour@ant.uni-bremen.de

2 Outline Channel Coding I
1. Introduction: declarations and definitions, general principle of channel coding; structure of digital communication systems
2. Introduction to Information Theory: probabilities, measure of information; SHANNON's channel capacity for different channels
3. Linear Block Codes: properties of block codes and general decoding principles; bounds on error rate performance; representation of block codes with generator and parity check matrices; cyclic block codes (CRC codes, Reed-Solomon and BCH codes)
4. Convolutional Codes: structure, algebraic and graphical representation; distance properties and error rate performance; optimal decoding with the Viterbi algorithm

3 Definitions. Chapter 2: Information Theory
Measure of information, entropy; entropies of a communication system; SHANNON's channel capacity for different channels (channel with discrete output alphabet, channel with continuous output alphabet, capacity for continuous input alphabet, capacity of the bandlimited channel); Gallager exponent and cut-off rate (Bhattacharyya bound for the error probability, Gallager function and Gallager exponent, cut-off rate); Appendix

4 DEFINITIONS

5 Basics of Information Theory
The basis for the conception of all communication systems, established by C. E. Shannon in 1948. Basic question: What is the maximum rate that can be transmitted over a given channel without errors? This maximum rate is known as the channel capacity.
Let X be a random variable taking values in the alphabet {X_0, X_1, ..., X_{M-1}} with probabilities Pr{X_μ}. How can the amount of information of a symbol be measured?
The amount of information should be non-negative and real: I(X_μ) = f(Pr{X_μ}) ≥ 0.
The amount of information should depend on the probability of the symbol.
For independent events the joint information corresponds to the sum of the individual contents: Pr{X_μ, Y_ν} = Pr{X_μ}·Pr{Y_ν} implies I(X_μ, Y_ν) = I(X_μ) + I(Y_ν).
The logarithm is the sole function that maps a product onto a sum.

6 Information, Entropy
Amount of information per symbol: I(X_μ) = log2(1/Pr{X_μ}) = −log2 Pr{X_μ}.
Units: log2 → bits (binary digits); log_e → nats (natural digits); log_10 → hartley.
Symbols with small probability yield a large amount of information, whereas likely symbols contain only a small amount of information.
Entropy: average amount of information of a symbol alphabet,
H(X) = E[−log2 Pr{X}] = −Σ_μ Pr{X_μ}·log2 Pr{X_μ}
It is the average amount of information provided by an observation of X, our uncertainty about X, the randomness of X (expectation E: see appendix).
Convention: 0·log2 0 = 0, since lim_{p→0} p·log2 p = 0.
Note: the entropy is not a function of the values of the random variable, but rather of the set and its probability mass function.
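As a quick illustration (a small sketch of my own, not part of the original slides), the entropy of an arbitrary pmf can be evaluated directly from the definition, including the convention 0·log2 0 = 0; the example pmf values are assumptions:

```python
import numpy as np

def entropy(pmf):
    """H(X) = -sum_mu Pr{X_mu} * log2 Pr{X_mu} in bits, with 0*log2(0) = 0."""
    p = np.asarray(pmf, dtype=float)
    assert np.all(p >= 0) and np.isclose(p.sum(), 1.0), "input must be a valid pmf"
    p = p[p > 0]                               # drop zero-probability symbols
    return float(-np.sum(p * np.log2(p)))

M = 8
print(entropy(np.full(M, 1 / M)))              # uniform pmf attains log2(M) = 3.0 bits
print(entropy([0.5, 0.25, 0.125, 0.125]))      # skewed pmf: 1.75 bits
```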

7 Entropy (Entropie)
For a discrete set of M elements the entropy fulfills 0 ≤ H(X) ≤ log2 M.
Proof of the lower bound: H(X) = 0 iff (if and only if) Pr{X_μ} = 1 for one μ and Pr{X_ν} = 0 for all ν ≠ μ, since Pr·log2 Pr = 0 both for Pr = 1 and (by convention) for Pr = 0.
Proof of the upper bound, using Jensen's inequality E[log2 u] ≤ log2(E[u]) for the concave function log2 u:
H(X) − log2 M = Σ_μ Pr{X_μ}·log2 [1/(M·Pr{X_μ})] ≤ log2 ( Σ_μ Pr{X_μ}·1/(M·Pr{X_μ}) ) = log2 1 = 0.
The entropy is maximized when all M elements are equally likely, i.e., Pr{X_μ} = 1/M:
H_max(X) = Σ_μ (1/M)·log2 M = log2 M bit.
A uniform distribution leads to maximum uncertainty ("most random" variable). Jensen's inequality for a concave function f: E[f(x)] ≤ f(E[x]). Rules for the logarithm: see appendix.

8 Example: Seven-Segment Display
(Figure: seven-segment display with segments a, b, c, d, e, f, g showing a digit.)
All ten digits occur with the same probability Pr{X_μ} = 1/10.
Amount of information per digit: I(X_μ) = log2 10 ≈ 3.32 bit.
Entropy of the alphabet: H(X) ≈ 3.32 bit.
Absolute redundancy of the 7-segment representation: 7 bit − 3.32 bit = 3.68 bit.
Relative redundancy: 3.68 bit / 7 bit ≈ 52.5 %.

9 Example: Entropy of a Binary Alphabet
Given: X ∈ {X_0, X_1} with Pr{X_0} = p and Pr{X_1} = 1 − p.
Entropy (binary entropy function): H(X) = E[−log2 Pr{X}] = −p·log2 p − (1−p)·log2(1−p).
The maximum entropy of a binary alphabet is reached for equally likely symbols, p = 0.5: H_max(X) = log2 2 = 1 bit.
(Figure: binary entropy function H(p) versus p; H(p) = 0.5 bit at p ≈ 0.11 and p ≈ 0.89, maximum of 1 bit at p = 0.5.)

10 Example: Non-uniform Symbol Alphabet
Given the alphabet {X_0, ..., X_7} with probabilities Pr{X_μ} = 1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64 and a binary representation of each symbol.
The entropy describes the minimum average number of bits needed to represent all alphabet symbols uniquely (entropy coding / source coding):
H(X) = (1/2)·log2 2 + (1/4)·log2 4 + (1/8)·log2 8 + (1/16)·log2 16 + 4·(1/64)·log2 64 = 2 bits.
Binary representation: {0, 10, 110, 1110, 111100, 111101, 111110, 111111}.
Average description length: l̄ = E[l] = Σ_μ Pr{X_μ}·l_μ = 2 bits.
Source coding / compression coding (e.g., Huffman coding) achieves this. In contrast, for a uniform probability 3 bits per symbol are required for the representation.
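A short numerical check (my own sketch, not from the slides) of the entropy and of the average codeword length of the prefix code reconstructed above:

```python
import numpy as np

p = np.array([1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64])
code = ["0", "10", "110", "1110", "111100", "111101", "111110", "111111"]

H = -np.sum(p * np.log2(p))                               # entropy in bits
avg_len = np.sum(p * np.array([len(c) for c in code]))    # average description length

print(H, avg_len)   # both evaluate to 2.0 bits for this dyadic pmf
```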

11 Generation of an Optimal Prefix-Free Code (Huffman Coding)
Example with the alphabet from the previous slide: Pr{X_0} = 1/2, Pr{X_1} = 1/4, Pr{X_2} = 1/8, Pr{X_3} = 1/16, Pr{X_4} = ... = Pr{X_7} = 1/64.
Auxiliary variables: Pr{A_1} = Pr{X_6} + Pr{X_7} = 1/32, Pr{A_2} = Pr{X_4} + Pr{X_5} = 1/32, Pr{A_3} = Pr{A_1} + Pr{A_2} = 1/16, and so on.
Algorithm:
while more than one symbol remains: arrange the symbols with decreasing probability; combine the two variables with the lowest probabilities into an auxiliary variable A_i = A_a & A_b with Pr{A_i} = Pr{A_a} + Pr{A_b}; distinguish A_a and A_b by the code bits 0 and 1.
The resulting binary representation is the prefix-free code given on the previous slide; see the sketch below.
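A minimal sketch of the combine-the-two-least-likely procedure described above (my own illustration, not the lecture's reference implementation); the symbol names are assumptions:

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Build a prefix-free code by repeatedly merging the two least likely entries."""
    tick = count()                                   # tie-breaker so heapq never compares dicts
    heap = [(p, next(tick), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p_a, _, code_a = heapq.heappop(heap)         # the two lowest probabilities
        p_b, _, code_b = heapq.heappop(heap)
        # prepend a distinguishing bit to every codeword in each branch
        merged = {s: "0" + c for s, c in code_a.items()}
        merged.update({s: "1" + c for s, c in code_b.items()})
        heapq.heappush(heap, (p_a + p_b, next(tick), merged))
    return heap[0][2]

probs = {"X0": 1/2, "X1": 1/4, "X2": 1/8, "X3": 1/16,
         "X4": 1/64, "X5": 1/64, "X6": 1/64, "X7": 1/64}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(c) for s, c in code.items())
print(code)
print(avg_len)    # 2.0 bits, equal to the entropy for these dyadic probabilities
```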

12 Illustration of Entropies
(Venn diagram with H(X), H(Y), H(X|Y), I(X;Y), H(Y|X), H(X,Y).)
H(X): entropy of the source alphabet
H(Y): entropy of the sink alphabet
H(X,Y): joint entropy of source and sink
H(X|Y): equivocation, the information lost during transmission
H(Y|X): irrelevance, the information originating not from the source
I(X;Y): mutual information, the information correctly transferred from source to sink

13 Joint Entropy (Verbundentropie)
H(X,Y) = E[−log2 Pr{X,Y}] = −Σ_x Σ_y Pr{x,y}·log2 Pr{x,y}
Entropy of the joint random variable (X,Y): the uncertainty of observing X and Y jointly.
Chain rule: with Pr{X,Y} = Pr{X}·Pr{Y|X} = Pr{Y}·Pr{X|Y},
H(X,Y) = E[−log2 Pr{X}] + E[−log2 Pr{Y|X}] = H(X) + H(Y|X) = H(Y) + H(X|Y).
(The expectation variable is dropped subsequently: E_{X,Y} → E.)
Conditioning reduces uncertainty ("information can't hurt on average"): H(X|Y) ≤ H(X).

14 Chain Rule of Uncertainty
Joint entropy of the random variables X_1, X_2, ..., X_N: H(X_1, X_2, ..., X_N) = E[−log2 Pr{X_1, X_2, ..., X_N}].
Using the chain rule of probabilities, Pr{X_1, ..., X_N} = Π_{n=1}^{N} Pr{X_n | X_1, ..., X_{n−1}}, yields
H(X_1, X_2, ..., X_N) = Σ_{n=1}^{N} E[−log2 Pr{X_n | X_1, ..., X_{n−1}}] = Σ_{n=1}^{N} H(X_n | X_1, ..., X_{n−1}).
Example: H(X,Y,Z) = H(X) + H(Y|X) + H(Z|X,Y) = H(X) + H(Z|X) + H(Y|X,Z).

15 Equivocation (Äquivokation)
H(X|Y) = E[−log2 Pr{X|Y}] = −Σ_x Σ_y Pr{x,y}·log2 Pr{x|y}
Conditional entropy: the uncertainty about X once Y is known, i.e., the information that X still contains if Y is known; it is the information lost during transmission.
Derivation 1: the conditional entropy of X given a particular observation Y = y is
H(X|Y=y) = −Σ_x Pr{x|y}·log2 Pr{x|y};
averaging over Y, the equivocation becomes
H(X|Y) = Σ_y Pr{y}·H(X|Y=y) = −Σ_x Σ_y Pr{y}·Pr{x|y}·log2 Pr{x|y} = −Σ_x Σ_y Pr{x,y}·log2 Pr{x|y},
using Pr{x,y} = Pr{y}·Pr{x|y}.

16 Equivocation
Derivation 2: H(X|Y) = H(X,Y) − H(Y). Conditioning reduces uncertainty: H(X|Y) ≤ H(X).
H(X|Y) = −Σ_x Σ_y Pr{x,y}·log2 Pr{x,y} + Σ_y Pr{y}·log2 Pr{y} = −Σ_x Σ_y Pr{x,y}·log2 [Pr{x,y}/Pr{y}],
using Pr{x|y} = Pr{x,y}/Pr{y}.
If X and Y are not independent, the knowledge of Y reduces the randomness about X on average.
Communication: if the received signal Y is given, the uncertainty about the transmitted signal X is reduced.

17 Irrelevance and Mutual Information
Irrelevance (Fehlinformation): H(Y|X) = E[−log2 Pr{Y|X}] = −Σ_x Σ_y Pr{x,y}·log2 Pr{y|x} = H(X,Y) − H(X); it is the information that Y contains if X is known, e.g., the noise.
Mutual information (wechselseitige Information, Transinformation):
I(X;Y) = H(X) − H(X|Y) = H(Y) − H(Y|X) = H(X) + H(Y) − H(X,Y)
H(X) represents the uncertainty about X before we know Y; H(X|Y) represents the uncertainty about X after Y is received; their difference is the amount of information provided about X by Y (communication).
Properties: I(X;Y) = I(Y;X) (mutuality); I(X;Y) ≥ 0; I(X;Y) ≤ min{H(X), H(Y)}.

18 Mutual Information
With the entropy definitions:
I(X;Y) = −Σ_x Pr{x}·log2 Pr{x} − Σ_y Pr{y}·log2 Pr{y} + Σ_x Σ_y Pr{x,y}·log2 Pr{x,y}
= Σ_x Σ_y Pr{x,y}·log2 [ Pr{x,y} / (Pr{x}·Pr{y}) ]
= Σ_x Σ_y Pr{y|x}·Pr{x}·log2 [ Pr{y|x} / Pr{y} ],  with Pr{y} = Σ_x' Pr{y|x'}·Pr{x'}.
The mutual information therefore depends only on the transition probabilities of the channel and on the input statistic.

19 Alternative Forms
Alternative forms of the mutual information:
I(X;Y) = Σ_x Σ_y Pr{x,y}·log2 [ Pr{x,y} / (Pr{x}·Pr{y}) ]
The mutual information is given by the average (taken over X and Y); in the literature you may also find the notation I(X,Y).
Equivalent expectation forms:
I(X;Y) = E[ log2 ( Pr{X,Y} / (Pr{X}·Pr{Y}) ) ] = E[ log2 ( Pr{X|Y} / Pr{X} ) ] = E[ log2 ( Pr{Y|X} / Pr{Y} ) ].
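To make these definitions concrete, here is a small sketch (my own illustration, not from the slides) that evaluates the entropies and the mutual information from an assumed joint pmf and checks the identities above:

```python
import numpy as np

def H(p):
    """Entropy of a pmf given as an array of probabilities."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# assumed example joint pmf Pr{x,y} (rows: x, columns: y)
Pxy = np.array([[0.30, 0.10],
                [0.05, 0.55]])
Px, Py = Pxy.sum(axis=1), Pxy.sum(axis=0)

H_X, H_Y, H_XY = H(Px), H(Py), H(Pxy.flatten())
I = H_X + H_Y - H_XY            # I(X;Y) = H(X) + H(Y) - H(X,Y)
equivocation = H_XY - H_Y       # H(X|Y)
irrelevance = H_XY - H_X        # H(Y|X)
print(I, H_X - equivocation, H_Y - irrelevance)   # all three forms agree
```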

20 Channel Capacity (Kanalkapazität)
(Venn diagram: H(X|Y), H(X), I(X;Y), H(Y), H(Y|X).)
Supremum: least upper bound.
Channel capacity: the maximum of the mutual information over all possible input statistics,
C = sup_{Pr{X}} I(X;Y) = sup_{Pr{X}} Σ_x Σ_y Pr{y|x}·Pr{x}·log2 [ Pr{y|x} / Pr{y} ]
in bits per channel use (bits/s/Hz).

21 Noisy-Channel Coding Theorem
For every discrete memoryless channel, the channel capacity C = sup_{Pr{X}} I(X;Y) has the following properties:
1. For any ε > 0 and R_c < C, for large enough n, there exists a code of length n and rate ≥ R_c and a decoding algorithm such that the maximal probability of block error is ≤ ε.
2. If a probability of bit error P_b is acceptable, rates up to R(P_b) = C / (1 − H_2(P_b)) are achievable, where H_2 denotes the binary entropy function.
3. For any P_b, rates greater than R(P_b) are not achievable.
In practice the optimization of the input statistic is often not possible. For equally distributed input symbols the mutual information depends only on the transition probabilities of the channel.

22 SHANNON'S CHANNEL CAPACITY FOR DIFFERENT CHANNELS

23 Channel Capacity of the BSC and Mutual Information for Different Input Statistics
Binary symmetric channel (BSC): Pr{y ≠ x | x} = P_e, Pr{y = x | x} = 1 − P_e.
(Figure: mutual information of the BSC versus P_e for the input statistics Pr{X_0} = 0.1, 0.3 and 0.5.)
Symmetric input statistic (Pr{X_0} = Pr{X_1} = 0.5):
C_max = 1 bit/s/Hz for P_e = 0 and P_e = 1, i.e., error-free transmission without coding.
The capacity decreases with increasing P_e; C_min = 0 bit/s/Hz for P_e = 0.5.
For P_e = 0.11, C = 0.5 bit/s/Hz: with optimal channel coding of rate R_c < 1/2, error-free transmission is possible in theory.
Non-symmetric input statistic: reduction in mutual information due to the symmetry of the channel.
Capacity of the BSC: C = 1 + P_e·log2 P_e + (1 − P_e)·log2(1 − P_e) = 1 − H_2(P_e)   (derivation: appendix and exercise).
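A small sketch (my own code, not the lecture's) that evaluates the mutual information of a BSC for an arbitrary input statistic and confirms C = 1 − H_2(P_e) for the symmetric input:

```python
import numpy as np

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def mi_bsc(p0, pe):
    """I(X;Y) of a BSC with Pr{X=0} = p0 and error probability pe."""
    py0 = p0 * (1 - pe) + (1 - p0) * pe    # Pr{Y=0} from the transition probabilities
    return h2(py0) - h2(pe)                # I = H(Y) - H(Y|X), with H(Y|X) = h2(pe)

pe = 0.11
print(mi_bsc(0.5, pe))                     # ~0.5 bit: the capacity C = 1 - h2(0.11)
print(mi_bsc(0.3, pe), mi_bsc(0.1, pe))    # non-symmetric inputs give less
```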

24 Quantization Bounds for the BSEC
Binary symmetric erasure channel (BSEC): besides the correct and the erroneous output there is an erasure output, with Pr{error} = P_e and Pr{erasure} = P_q.
(Figure: BSEC transition diagram and the conditional pdfs for x = −1 and x = +1 with decision thresholds at ±a.)
The threshold parameter a has to be optimized with respect to the channel capacity; the optimal choice depends on the signal-to-noise ratio E_s/N_0.
The choice of the parameter a yields P_e and P_q by integration of the conditional pdfs over the corresponding decision intervals.

25 Channel Capacity of BSC and BSEC
C_BSEC = 1 − P_q + P_e·log2 P_e + (1 − P_e − P_q)·log2(1 − P_e − P_q) − (1 − P_q)·log2(1 − P_q)
(Figure: capacity versus E_s/N_0 in dB for the BSC (a = 0) and the BSEC with optimized threshold a.)
A threshold a > 0 leads only to a minor improvement of the channel capacity.

26 Channels with Continuous Output Alphabet
For the AWGN channel the output alphabet is continuous. Continuous amplitudes of y result in an integral for the capacity calculation. For equally likely input symbols X_k:
I(X;Y) = Σ_k Pr{X_k} ∫ p(y|X_k)·log2 [ p(y|X_k) / p(y) ] dy.
In general, a quantization of the channel output with q bits takes place, which results in a finite number of 2^q output symbols y.

27 Channel Capacity of BPSK over the AWGN Channel for Different Quantization Levels
Quantization leads to a loss of information, so C decreases. Quantization with q = 3 bits leads only to a negligible loss in comparison to q = ∞ (no quantization).
(Figure: capacity versus E_s/N_0 in dB for q = 1, q = 2, q = 3 and for the unquantized Gaussian output.)

28 Differential Entropy
Generalization to continuous random variables X with pdf p(x).
Differential entropy (differentielle Entropie): h(X) = E[−log2 p(X)] = −∫ p(x)·log2 p(x) dx.
h(X) does not give the amount of information in X: theoretically, an infinite number of bits is required to represent a continuous signal. h(X) can also be negative, so there is no direct physical interpretation.
Example: uniform distribution, p(x) = 1/(2a) for |x| ≤ a and 0 else:
h(X) = −∫_{−a}^{a} (1/(2a))·log2(1/(2a)) dx = log2(2a).
For a < 0.5, log2(2a) < 0, so h(X) is negative. With the variance σ² = a²/3: h(X) = (1/2)·log2(4a²) = (1/2)·log2(12σ²).
For a finite domain (i.e., a fixed maximum absolute value), the uniformly distributed random variable leads to the largest differential entropy.

29 Differential Entropy
Example: normal distribution with mean μ and variance σ²,
p(x) = (1/√(2πσ²))·exp(−(x−μ)²/(2σ²))
h(X) = −∫ p(x)·log2 p(x) dx = (1/2)·log2(2πσ²) + (1/2)·log2 e = (1/2)·log2(2πeσ²).
h(X) depends only on the spread of the distribution (i.e., σ²), but not on the mean μ.
The Gaussian random variable has the largest differential entropy among all continuous random variables of the same variance.
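As a quick numerical cross-check (my own sketch, not part of the slides; the value of σ is an assumed example), the differential entropy of a Gaussian can be approximated by a Riemann sum and compared with (1/2)·log2(2πeσ²):

```python
import numpy as np

sigma = 1.7                                        # assumed example standard deviation
x = np.linspace(-12 * sigma, 12 * sigma, 200001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

h_numeric = -np.sum(p * np.log2(p)) * dx           # -integral of p(x) log2 p(x) dx
h_closed = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
print(h_numeric, h_closed)                         # agree to several decimal places
```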

30 Shaping Gain
How much power reduction is possible by using Gaussian instead of uniformly distributed random variables of equal differential entropy?
Entropy of the uniform distribution with variance σ_U²: h(X) = (1/2)·log2(12·σ_U²).
Entropy of the Gaussian distribution with variance σ_G²: h(X) = (1/2)·log2(2πe·σ_G²).
Equal differential entropies require 12·σ_U² = 2πe·σ_G², so the shaping gain is
G = σ_U²/σ_G² = πe/6 ≈ 1.42 ≙ 1.53 dB.
A Gaussian random variable achieves the same differential entropy with an average power 1.53 dB less than required by a uniformly distributed random variable.
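The 1.53 dB figure follows directly from the two entropy expressions; a one-line check (my own sketch):

```python
import numpy as np

G = np.pi * np.e / 6               # variance ratio for equal differential entropy
print(G, 10 * np.log10(G))         # ~1.423 and ~1.53 dB
```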

31 Channel Capacity for Continuous Input Alphabet
Generalization to models with continuous input alphabets:
C = sup_{p(x)} [ h(Y) − h(Y|X) ].
Worst case: Gaussian distributed noise maximizes the irrelevance h(Y|X). To maximize the sink entropy h(Y), the receive signal should be Gaussian distributed; thus the transmit signal X needs to be Gaussian distributed as well.
AWGN channel with noise power spectral density N_0/2: x ~ N(0, σ_x²), n ~ N(0, N_0/2), y = x + n with σ_y² = E_s + N_0/2:
C = h(Y) − h(N) = (1/2)·log2(2πe·σ_y²) − (1/2)·log2(2πe·N_0/2) = (1/2)·log2(1 + E_s/(N_0/2)).
The capacity increases with the SNR (channel quality); see the plot C(E_s/N_0) on slide 27.

32 Channel Capacity for Continuous Input Alphabet
Recall: the information vector u of length k is mapped by the channel encoder to the code vector c of length n > k.
E_b: average energy per information bit; E_s: average energy per transmit symbol ("code bit").
For a fair comparison of coded and uncoded systems, the encoder should not increase the energy (i.e., not act as an amplifier). No energy increase due to coding:
k·E_b = n·E_s  ⇒  E_s = (k/n)·E_b = R_c·E_b  ⇒  E_s ≤ E_b.

33 Channel Capacity for Continuous Input Alphabet
Capacity of the 2-D AWGN channel:
C = log2(1 + E_s/N_0) = log2(1 + R_c·E_b/N_0).
Setting R_c = C gives the implicit equation E_b/N_0 = (2^C − 1)/C, the SNR required to achieve the rate C.
Minimum SNR for error-free transmission of information bits:
lim_{C→0} E_b/N_0 = lim_{C→0} (2^C − 1)/C = ln 2 ≈ 0.69 ≙ −1.59 dB.
No error-free communication is possible for E_b/N_0 < −1.59 dB.
(Figure: C versus E_b/N_0 in dB with the asymptote at −1.59 dB.)
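A small sketch (my own code) that evaluates the implicit relation E_b/N_0 = (2^C − 1)/C for a few rates and reproduces the −1.59 dB limit:

```python
import numpy as np

def ebn0_db_required(C):
    """Minimum Eb/N0 in dB of the 2-D AWGN channel at capacity C (bits per channel use)."""
    return 10 * np.log10((2.0**C - 1.0) / C)

for C in [2.0, 1.0, 0.5, 0.1, 1e-6]:
    print(C, ebn0_db_required(C))      # decreases towards the limit as C -> 0
print(10 * np.log10(np.log(2)))        # limit ln(2): about -1.59 dB
```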

34 Channel Capacity
How does the performance of the system depend on the basic resources?
Key observation: f(x) = log2(1 + x) is concave, i.e., f''(x) < 0. The higher the SNR, the smaller the effect of additional power on the capacity:
log2(1 + x) ≈ x·log2 e for x « 1;  log2(1 + x) ≈ log2 x for x » 1.
Small SNR: doubling the power doubles C. High SNR: doubling the power increases C by 1 bit.
(Figure: C versus E_s/N_0.)

35 BI-AWGNC
Binary-input AWGN channel.
Direct calculation, using the relation from "Channels with Continuous Output Alphabet":
C = (1/2)·Σ_{x=±1} ∫ p(y|x)·log2 [ p(y|x) / p(y) ] dy
by numerical integration, using the conditional pdfs p(y|x) = (1/√(2πσ²))·exp(−(y − x)²/(2σ²)).
Alternative calculation: C = h(Y) − h(Y|X), with the differential entropy of the noise h(Y|X) = h(N) = (1/2)·log2(2πeσ²); the sink entropy h(Y) is approximated by Monte-Carlo integration with a sufficiently large number of trials.
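A Monte-Carlo sketch of the alternative calculation (my own code with assumed parameters, not the lecture's implementation): h(Y) is estimated as the sample mean of −log2 p(y) and h(Y|X) is taken in closed form:

```python
import numpy as np

rng = np.random.default_rng(0)

def bi_awgn_capacity(es_n0_db, num_trials=200_000):
    """Capacity of the binary-input AWGN channel (BPSK, x = +/-1) in bits per channel use."""
    es_n0 = 10 ** (es_n0_db / 10)
    sigma2 = 1.0 / (2 * es_n0)                        # noise variance for Es = 1
    x = rng.choice([-1.0, 1.0], size=num_trials)      # equally likely BPSK symbols
    y = x + rng.normal(scale=np.sqrt(sigma2), size=num_trials)
    # mixture density p(y) = 0.5*N(y; +1, sigma2) + 0.5*N(y; -1, sigma2)
    norm = 1.0 / np.sqrt(2 * np.pi * sigma2)
    p_y = 0.5 * norm * (np.exp(-(y - 1) ** 2 / (2 * sigma2)) +
                        np.exp(-(y + 1) ** 2 / (2 * sigma2)))
    h_y = np.mean(-np.log2(p_y))                      # Monte-Carlo estimate of h(Y)
    h_n = 0.5 * np.log2(2 * np.pi * np.e * sigma2)    # h(Y|X) = h(N)
    return h_y - h_n

for snr_db in [-5, 0, 5, 10]:
    print(snr_db, bi_awgn_capacity(snr_db))           # approaches 1 bit at high SNR
```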

36 BI-AWGNC
(Figure: capacity versus E_s/N_0 and versus E_b/N_0 for the Shannon capacity (Gaussian input/output), the soft-decision BI-AWGNC and the hard-decision BSC.)
A hard decision prior to decoding results in a loss of 1 to 2 dB.

37 Capacity of a Bandlimited Channel
Channel with bandwidth B: due to the 1st Nyquist criterion, 2B symbols can be transmitted per second (symbol duration T_s = 1/(2B)).
Signal-to-noise ratio: S/N = (E_s/T_s)/(B·N_0) = 2·E_s/N_0.
Capacity in bits per second:
C = (1/T_s)·(1/2)·log2(1 + S/N) = B·log2(1 + S/N) ≈ 3.32·B·log10(1 + S/N) bits/s.
Example: telephone channel with B = 3 kHz and S/N = 30 dB:
C = 3000 Hz · log2(1 + 10³) ≈ 29.9 kbit/s, in both directions.
V.90 modem: requires more bandwidth and a higher SNR, e.g., B = 4 kHz and S/N = 42 dB:
C = 4000 Hz · log2(1 + 10^4.2) ≈ 55.8 kbit/s, only in the downlink (digital channel from the server to the telephone switch).
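A short check of the two numerical examples (my own sketch; the bandwidths and SNR values are the ones reconstructed above):

```python
import numpy as np

def capacity_bits_per_s(bandwidth_hz, snr_db):
    """Shannon capacity C = B*log2(1 + S/N) of a bandlimited AWGN channel."""
    return bandwidth_hz * np.log2(1 + 10 ** (snr_db / 10))

print(capacity_bits_per_s(3_000, 30) / 1e3)   # ~29.9 kbit/s (telephone channel)
print(capacity_bits_per_s(4_000, 42) / 1e3)   # ~55.8 kbit/s (V.90-class modem)
```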

38 Data Rates of Communication Systems over Copper Telephone Lines
Analog telephone (POTS): bandwidth 300 Hz – 3.4 kHz; up to 56 kbit/s (typically 4.5 – 5 kbyte/s)
ISDN: bandwidth 0 Hz – 120 kHz; 64 kbit/s data channel + 16 kbit/s control channel
ADSL (ADSL-over-ISDN, Annex B): upstream 138 kHz – 276 kHz, downstream 276 kHz – 1.1 MHz; upstream 1 Mbit/s, downstream up to 10 Mbit/s
ADSL2+ (ADSL-over-ISDN): upstream 138 kHz – 276 kHz, downstream 276 kHz – 2.2 MHz; upstream 1 Mbit/s, downstream up to 24 Mbit/s
The actual data rate depends strongly on the distance to the telephone switch.

39 GALLAGER EXPONENT AND CUT-OFF RATE

40 Gallager Exponent and Cut-Off Rate
The channel capacity makes no statement about the structure of a code or its performance; it describes only the asymptotic behavior for very long codes and is therefore not suitable to estimate the word error rate for a specific code word length. It is only a theoretical bound.
The Gallager exponent achieves a statement for a given code word length n.
Steps of the subsequent derivation: Bhattacharyya bound for the error probability; Gallager function and Gallager error exponent; cut-off rate; interpretation. (Robert G. Gallager)

41 Bhattacharyya Bound (1)
Code of rate R_c = k/n with encoding function x = f(u) and decoding function û = g(y):
information words u with 2^k elements, code words x with 2^k elements, receive words y of length n.
Decoding area D_i: the set of receive vectors y with decoding result û = u^(i), i.e., D_i = { y : g(y) = u^(i) }.
The decoding areas are disjoint: D_i ∩ D_j = ∅ for i ≠ j.

42 Bhattacharyya Bound (2)
Error probability for a specific information word u^(i):
P_w,i = Pr{û ≠ u^(i) | u^(i)} = Σ_{y ∉ D_i} Pr{y | x^(i)}   with û = g(y), x^(i) = f(u^(i)),
i.e., the sum of the transition probabilities over all y ∉ D_i, i.e., all receive words that lead to a decoding error.
In general, summing over all y ∉ D_i is complicated (it requires a description of the decoding areas); it would be easier to sum over all possible y, which motivates a trick with a scaling factor.
Special example with 2 code words (i.e., k = 1):
P_w,1 = Σ_{y ∈ D_2} Pr{y | x^(1)}.
Introducing the scaling factor √( Pr{y | x^(2)} / Pr{y | x^(1)} ), which is ≥ 1 for y ∈ D_2 under ML decoding, yields
P_w,1 ≤ Σ_{y ∈ D_2} Pr{y | x^(1)} · √( Pr{y | x^(2)} / Pr{y | x^(1)} ).

43 Bhattacharyya Bound (3)
Multiplying P_w,1 with the scaling factor yields the upper bound
P_w,1 ≤ Σ_{y ∈ D_2} √( Pr{y | x^(1)} · Pr{y | x^(2)} ).
Summing over all y achieves a further upper bound; the decoding areas then do not have to be known:
P_w,1 ≤ Σ_y √( Pr{y | x^(1)} · Pr{y | x^(2)} ).
The equivalent expression holds for P_w,2, so the average word error probability for two code words (Pr{x^(1)} + Pr{x^(2)} = 1) fulfills
P_w = Pr{x^(1)}·P_w,1 + Pr{x^(2)}·P_w,2 ≤ Σ_y √( Pr{y | x^(1)} · Pr{y | x^(2)} ),
since the expression is nonnegative for all y.

44 Bhattacharyya Bound (4)
For a DMC (discrete memoryless channel) with Pr{y | x^(i)} = Π_{j=1}^{n} Pr{y_j | x_j^(i)}, the expression for the error probability factorizes:
P_w ≤ Σ_y Π_{j=1}^{n} √( Pr{y_j | x_j^(1)} · Pr{y_j | x_j^(2)} ) = Π_{j=1}^{n} Σ_y √( Pr{y | x_j^(1)} · Pr{y | x_j^(2)} )
(word error probability for a code containing two code words, k = 1).
Bhattacharyya bound: error probability for a general code with 2^k code words of length n and ML decoding:
P_w,i ≤ Σ_{i' ≠ i} Π_{j=1}^{n} Σ_y √( Pr{y | x_j^(i)} · Pr{y | x_j^(i')} ).
This may become very loose if 2^k is large; improvement by Gallager.

45 Gallager Exponent (1)
Including a weighting into the expression for the word error probability
P_w,i = Pr{û ≠ u^(i) | u^(i)} = Σ_{y ∉ D_i} Pr{y | x^(i)}.
If x^(i) is transmitted, then in case of an error (y ∉ D_i) at least one x^(i') fulfills Pr{y | x^(i')} ≥ Pr{y | x^(i)}.
As this quotient is non-negative for all x^(i'), the sum over all quotients is larger than one. Thus the following relation holds for y ∉ D_i:
[ Σ_{i' ≠ i} ( Pr{y | x^(i')} / Pr{y | x^(i)} )^s ]^ρ ≥ 1   with s ≥ 0, ρ ≥ 0.
Aim: choose the parameters s and ρ so that the sum fulfills the inequality tightly.
s < 1: reduce quotients that are much larger than 1.
ρ < 1: reduce the sum for 2^k » 1 (a lot of terms in the sum). ρ is the Gallager factor.

46 Gallager Exponent (2)
Including the weighting factor in the expression for the word error probability:
P_w,i ≤ Σ_y Pr{y | x^(i)} · [ Σ_{i' ≠ i} ( Pr{y | x^(i')} / Pr{y | x^(i)} )^s ]^ρ.
With the choice s = 1/(1+ρ) and summation over all y, the inequality becomes
P_w,i ≤ Σ_y Pr{y | x^(i)}^{1/(1+ρ)} · [ Σ_{i' ≠ i} Pr{y | x^(i')}^{1/(1+ρ)} ]^ρ
and simplifies for a DMC by factorizing over the n symbols.
Gallager bound: it corresponds to the Bhattacharyya bound for ρ = 1, and it is an upper bound for P_w,i.
Again, due to its complexity it is hard to evaluate for a specific code and channel; hence the error probability is approximated by an expectation over all codes.

47 Gallager Exponent (3)
Expectation of P_w,i over all codes under the condition that x^(i) was sent: P_w = E[P_w,i].
For a DMC the factorization yields
P_w ≤ 2^{ −n·[ E_0(ρ, Pr{X}) − ρ·R_c ] }
with the Gallager function
E_0(ρ, Pr{X}) = −log2 Σ_y [ Σ_x Pr{x}·Pr{y|x}^{1/(1+ρ)} ]^{1+ρ},
which is independent of n.
The weighting factor ρ yields an upper bound for the word error probability; to achieve an appropriate estimation, the inequality has to be tightened with respect to ρ: maximize E_0(ρ, Pr{X}) − ρ·R_c (equivalently, minimize the bound) over the Gallager factor ρ.

48 Gallager Exponent and Cut-Off Rate
Maximization gives the Gallager exponent
E_G(R_c) = max_ρ max_{Pr{X}} [ E_0(ρ, Pr{X}) − ρ·R_c ],  so that P_w ≤ 2^{−n·E_G(R_c)}.
Each pair of Pr{X} and ρ corresponds to a line with slope −ρ in the (R_c, E_G) plane; up to R_crit, ρ = 1 achieves the maximum (= Bhattacharyya).
Cut-off rate: R_0 = max_{Pr{X}} E_0(1, Pr{X}) = max_{Pr{X}} [ −log2 Σ_y ( Σ_x Pr{x}·√Pr{y|x} )² ].
R_0 is important for sequential decoding, which is not efficient for rates larger than R_0 (computational cut-off rate), and it gives the lower bound for ρ = 1: P_w ≤ 2^{−n·(R_0 − R_c)}.
(Figure: E_G(R_c) versus R_c, with the linear segment of slope −1 up to R_crit and E_G = 0 at R_c = C.)
After the maximization the Gallager exponent depends only on the code rate R_c:
E_G(R_c) > 0 for R_c < C  and  E_G(R_c) = 0 for R_c ≥ C.
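To illustrate the maximization, here is a small sketch (my own code, not from the slides) that evaluates the Gallager function E_0(ρ) of a BSC with uniform input and the resulting exponent E_G(R_c) by a grid search over ρ:

```python
import numpy as np

def e0_bsc(rho, pe):
    """Gallager function E_0(rho) of a BSC with error probability pe and uniform input."""
    # inner sum over x for each output y; by symmetry both outputs give the same term
    inner = 0.5 * (1 - pe) ** (1 / (1 + rho)) + 0.5 * pe ** (1 / (1 + rho))
    return -np.log2(2 * inner ** (1 + rho))

def gallager_exponent(rc, pe, rhos=np.linspace(1e-3, 1.0, 1000)):
    """E_G(R_c) = max_rho [E_0(rho) - rho*R_c] for the uniform-input BSC."""
    return max(e0_bsc(r, pe) - r * rc for r in rhos)

pe = 0.05
print(e0_bsc(1.0, pe))                     # cut-off rate R_0 = E_0(1)
for rc in [0.2, 0.4, 0.6]:
    print(rc, gallager_exponent(rc, pe))   # positive below capacity, tends to 0 near C
```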

49 Interpretation
Question: What is the maximum code rate R_c with E_G(R_c) > 0?
From the previous figure the maximum is achieved for ρ → 0. Due to the indeterminate form 0/0, application of l'Hospital's rule gives
lim_{ρ→0} E_0(ρ, Pr{X}) / ρ = I(X;Y),
so R_c,max = max_{Pr{X}} I(X;Y) = C.
For each code rate R_c = k/n < C an (n,k) block code exists with a word error probability
P_w ≤ 2^{−n·E_G(R_c)}.
The Gallager exponent provides a statement not only regarding the channel, but also about the word error rate P_w and the block length n.

50 Interpretation
Observation: E_G(R_c) > 0 for R_c < C and E_G(R_c) = 0 for R_c ≥ C.
By increasing n for R_c < C, the error probability P_w ≤ 2^{−n·E_G(R_c)} can be arbitrarily reduced. For R_c → C the Gallager exponent E_G(R_c) approaches zero, and n must tend to infinity in order to achieve a small error probability.
For ρ = 1 the following relation holds: P_w ≤ 2^{−n·(R_0 − R_c)}. The closer R_c is to R_0, the longer the code word length n has to be chosen in order to achieve the same P_w.
Three regions:
0 < R_c < R_0: the error probability P_w is bounded by n and the difference R_0 − R_c; the bound can be computed.
R_0 < R_c < C: the error probability P_w is bounded by n and the error exponent E_G(R_c); the bound is almost not computable.
C < R_c: the error probability P_w cannot be decreased arbitrarily.

51 Comparison of Channel Capacity and Cut-Off Rate for the Binary Symmetric Channel (BSC)
R_0 for a discrete memoryless channel (DMC) with symmetric binary input (Pr{0} = Pr{1} = 0.5):
R_0 = E_0(1) = −log2 Σ_y [ (1/2)·√Pr{y|0} + (1/2)·√Pr{y|1} ]² = 1 − log2 [ 1 + Σ_y √( Pr{y|0}·Pr{y|1} ) ].
With Pr{0|0} = Pr{1|1} = 1 − P_e and Pr{1|0} = Pr{0|1} = P_e:
R_0 = 1 − log2 ( 1 + 2·√( P_e·(1 − P_e) ) ).
(Figure: C and R_0 versus P_e.) The comparison shows that R_0 is weaker than the capacity.
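A quick comparison (my own sketch) of C = 1 − H_2(P_e) and R_0 = 1 − log2(1 + 2·√(P_e(1−P_e))) over a range of error probabilities:

```python
import numpy as np

pe = np.linspace(0.001, 0.5, 6)
H2 = -pe * np.log2(pe) - (1 - pe) * np.log2(1 - pe)   # binary entropy function

C = 1 - H2                                            # BSC capacity
R0 = 1 - np.log2(1 + 2 * np.sqrt(pe * (1 - pe)))      # cut-off rate

for p, c, r in zip(pe, C, R0):
    print(f"Pe={p:.3f}  C={c:.3f}  R0={r:.3f}")       # R0 <= C everywhere
```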

52 APPENDIX

53 Expectation (1)
Discrete random variable (RV) X taking values in the alphabet X = {X_0, X_1, X_2, ...} with probabilities Pr{X = X_μ} = Pr{X_μ}.
Expected value of a function f(X): E[f(X)] = Σ_μ f(X_μ)·Pr{X_μ}.
Mean value of X: f(x) = x, so m_X = E[X] = Σ_μ X_μ·Pr{X_μ}.
Continuous RV: the expected value of a measurable function f(X) of X, given that X has the probability density function p(x), is given by the inner product of p and f:
E[f(X)] = ∫ f(x)·p(x) dx.
Mean value of X: f(x) = x, so m_X = E[X] = ∫ x·p(x) dx.
Variance of X: f(x) = (x − m_X)², so σ_X² = E[(X − m_X)²] = ∫ (x − m_X)²·p(x) dx.

54 Expectation (2)
Linearity: the expected value operator (or expectation operator) E[·] is linear:
E[c·X] = c·E[X]
E[X + Y] = E[X] + E[Y]
Note that the second result is valid even if X is not statistically independent of Y. Combining the results from the previous equations, we can see that
E[c_1·X + c_2·Y] = c_1·E[X] + c_2·E[Y].

55 Rules for the Logarithm
The logarithm of x to base b, denoted log_b x, is the unique real number y such that b^y = x.
Bases: b = 10: common logarithm log x; b = e: natural logarithm ln x; b = 2: binary logarithm log2 x (also written ld x).
Change of base: log_b x = log_a x / log_a b.
Rules: Product: log_b(x·y) = log_b x + log_b y. Quotient: log_b(x/y) = log_b x − log_b y. Power: log_b(x^p) = p·log_b x. Root: log_b(x^(1/n)) = (1/n)·log_b x.
Example: log2 x = ln x / ln 2.

56 Convexity and Jensen's Inequality
A convex function f(x) over the interval [x_1, x_2] fulfills, for every x_1, x_2 and 0 ≤ λ ≤ 1,
f(λ·x_1 + (1−λ)·x_2) ≤ λ·f(x_1) + (1−λ)·f(x_2),
i.e., the line segment between (x_1, f(x_1)) and (x_2, f(x_2)) lies above the graph.
The second derivative of a convex function is non-negative over the interval.
Examples: x², |x|, e^x, x·log x, and −log x for x > 0.
A function f(x) is concave if −f(x) is convex. Examples: log x, √x.

57 Jensen's Inequality
If f(x) is a convex function and X is a random variable, the inequality E[f(X)] ≥ f(E[X]) holds.
The inequality states that the convex transformation of a mean is less than or equal to the mean applied after the convex transformation.
Physical version: if a collection of masses is placed on a convex curve at the locations x_1, x_2, ..., then the resulting center of mass, given by (E[X], E[f(X)]), lies above the curve.
Example: two variables x_1, x_2 with probability 0.5 each.
If f(x) is a concave function and X is a random variable: E[f(X)] ≤ f(E[X]).

58 BSC Capacity (1)
Entropy of the sink for a symmetric input statistic:
H(Y) = E[−log2 Pr{Y}] = −Σ_ν Pr{Y_ν}·log2 Pr{Y_ν} = 1 bit/s/Hz.
Irrelevance:
H(Y|X) = E[−log2 Pr{Y|X}] = −Σ_μ Σ_ν Pr{X_μ, Y_ν}·log2 Pr{Y_ν | X_μ}
= −Pr{X_0}·[ (1−P_e)·log2(1−P_e) + P_e·log2 P_e ] − Pr{X_1}·[ (1−P_e)·log2(1−P_e) + P_e·log2 P_e ]
= −P_e·log2 P_e − (1−P_e)·log2(1−P_e).
Capacity of the BSC:
C = H(Y) − H(Y|X) = 1 + P_e·log2 P_e + (1−P_e)·log2(1−P_e) = 1 − H_2(P_e).

59 BSC Capacity (2)
BSC: y = x ⊕ e with e ~ Bern(P_e) (Bernoulli distribution).
Irrelevance: H(Y|X) = H(X ⊕ e | X) = H(e | X) = H(e) = −P_e·log2 P_e − (1−P_e)·log2(1−P_e).
Sink entropy: for x ~ Bern(0.5), also y ~ Bern(0.5), so H(Y) = 1 bit.
Capacity:
C = sup_{Pr{X}} I(X;Y) = sup_{Pr{X}} [ H(Y) − H(Y|X) ] = sup_{Pr{X}} H(Y) − H_2(P_e) = 1 + P_e·log2 P_e + (1−P_e)·log2(1−P_e).
