ELEC 405/511 Error Control Coding. Binary Convolutional Codes
1 ELEC 405/511 Error Control Coding Binary Convolutional Codes
2 Peter Elias ( ) Coding for Noisy Channels,
3 With block codes, the input data is divided into blocks of length k and the codewords have blocklength n. With convolutional codes, the input data is encoded in a continuous manner:
- the output is a single codeword
- encoding is done using only shift registers and XOR gates (binary arithmetic)
- the encoder output is a function of the current input bits and the shift register contents
4 Rate 1/2 Linear Convolutional Encoder [figure]
5 Encoding
Input data stream: x = (x0, x1, x2, ...)
Output data streams:
y(0) = (y0(0), y1(0), y2(0), ...)
y(1) = (y0(1), y1(1), y2(1), ...)
Multiplex into a single data stream:
y = (y0(0), y0(1), y1(0), y1(1), y2(0), y2(1), ...)
6 ETSI TS V8.4.0 ( ) Technical Specification LTE; Evolved Universal Terrestrial Radio Access (E-UTRA); Multiplexing and channel coding. Rate 1/3 Convolutional Encoder
7 [figure]
8 Rate 2/3 Linear Convolutional Encoder
9 Rate 1/2 Linear Convolutional Encoder
10 Example 11-1
Input data stream (first input bit on the left): x = (10110)
Output data streams:
y(0) = (10001010)
y(1) = (11111110)
Multiplex into a single data stream:
y = (11,01,01,01,11,01,11,00)
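The streams in Example 11-1 can be reproduced with a short sketch: each output stream is the GF(2) convolution of the input with a generator sequence (the generators g(0) = (1011) and g(1) = (1101) are the ones given on the later slides), and the two streams are multiplexed bit by bit.

```python
def conv_gf2(x, g):
    """Convolve two binary sequences over GF(2) (XOR accumulation)."""
    y = [0] * (len(x) + len(g) - 1)
    for i, xi in enumerate(x):
        for j, gj in enumerate(g):
            y[i + j] ^= xi & gj
    return y

x = [1, 0, 1, 1, 0]
g0, g1 = [1, 0, 1, 1], [1, 1, 0, 1]

y0 = conv_gf2(x, g0)                            # stream y(0) = (10001010)
y1 = conv_gf2(x, g1)                            # stream y(1) = (11111110)
y = [b for pair in zip(y0, y1) for b in pair]   # multiplexed stream
```

Grouping y into pairs gives (11,01,01,01,11,01,11,00), matching the slide.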
11 Fractional Rate Loss
Message length: L bits
Codeword length: L(n/k) + nm bits
Effective code rate:
R_effective = L / (L(n/k) + nm)
Fractional rate loss:
γ = (R - R_effective)/R = km/(L + km)
12 Example 11-2
From Example 11-1, the message length is L = 5
Codeword length: L(n/k) + nm = 5(2/1) + 2(3) = 16
Effective code rate: R_effective = L / (L(n/k) + nm) = 5/16
Fractional rate loss: γ = (R - R_effective)/R = km/(L + km) = 3/(5+3) = 3/8
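The numbers in Example 11-2 can be checked with exact rational arithmetic; a minimal sketch with k, n, m, L as defined on the previous slide:

```python
from fractions import Fraction

k, n, m, L = 1, 2, 3, 5                   # rate 1/2 code, memory order 3

codeword_len = L * n // k + n * m         # L(n/k) + nm = 16
R = Fraction(k, n)                        # nominal rate 1/2
R_eff = Fraction(L, codeword_len)         # effective rate 5/16
gamma = (R - R_eff) / R                   # fractional rate loss
```

gamma comes out to 3/8 = km/(L + km), as on the slide.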
13 Fractional Rate Loss
Typically k = 1 and L is large: L >> m
In this case
R_effective = (1/n) · L/(L + m) → 1/n as L → ∞
Fractional rate loss becomes
γ = m/(L + m) → 0
14 Memory order m: length of the longest shift register, m = max m_i
Constraint length: K = 1 + max m_i
Total memory (overall constraint length): sum of all memory elements, M = Σ m_i
Number of input streams: k
Number of output streams: n
Code rate: k/n
State: contents of the shift registers
Initial state: 00...0
End state: 00...0 (input zeros at the end)
15 Punctured Convolutional Codes
Complexity is a function of the total memory (overall constraint length), so codes with k > 1 are rarely used. Puncturing allows for higher code rates from a rate 1/2 code.
For rate 2/3, delete one of every 4 output bits:
input x1 x2 → output y1 y2 y3 y4 → remove y3
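A sketch of the puncturing step: keep three of every four bits of the rate-1/2 output (deleting y3 of each group y1 y2 y3 y4), so two input bits now map to three transmitted bits.

```python
def puncture_to_rate_2_3(y):
    """Delete the third bit of every 4-bit group of a rate-1/2 stream."""
    return [b for i, b in enumerate(y) if i % 4 != 2]

# 4 output bits per 2 input bits become 3 transmitted bits: rate 2/3
punctured = puncture_to_rate_2_3([1, 1, 0, 1, 0, 1, 1, 0])
```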
16 Rate 1/2 Linear Convolutional Encoder
17 Rate 2/3 Linear Convolutional Encoder
18 Generator Sequences
g_j(i) represents the ith output of an encoder when the jth input is a single 1 followed by a string of zeros: x(j) = δ = (10000...)
Rate 1/2 example (K = 4):
g(0) = (1011)
g(1) = (1101)
Rate 2/3 example:
g_0(0) = (1001)   g_1(0) = (0110)
g_0(1) = (0111)   g_1(1) = (1010)
g_0(2) = (1100)   g_1(2) = (0100)
19 Why "Convolutional" Code?
For the rate 1/2 (2,1,4) code with impulse response
g(0) = (1011)
g(1) = (1101)
the output of the encoder is the convolution of the input with the generator sequences:
y(0) = x * g(0)
y(1) = x * g(1)
If x = (10110), the output is
y(0) = (10110)*(1011) = (10001010)
y(1) = (10110)*(1101) = (11111110)
20 Rate 1/2 Linear Convolutional Encoder
(2,1,3) code
g(0) = (111)
g(1) = (101)
21 A convolutional code is in fact a linear block code for finite length L. For a rate 1/2 code, the generator matrix can be formed by interleaving g(0) and g(1):

G = [ g_0(0) g_0(1)   g_1(0) g_1(1)   ...   g_m(0) g_m(1)
      0 0             g_0(0) g_0(1)   ...   g_(m-1)(0) g_(m-1)(1)   g_m(0) g_m(1)
      ... ]

Each row is the previous row shifted right by 2; G has dimension L x 2(m+L).
22 For the rate 1/2 (2,1,4) code (L = 4):

G = [ 11 01 10 11 00 00 00
      00 11 01 10 11 00 00
      00 00 11 01 10 11 00
      00 00 00 11 01 10 11 ]

For the rate 2/3 (3,2,4) code, the rows come in pairs (one per input stream), each pair shifted right by 3 bits:

G = [ 101 011 010 110 000 ...
      010 101 110 000 000 ...
      000 101 011 010 110 ...
      000 010 101 110 000 ...
      ... ]
23 Example 11-3
(2,1,4) code, input x = (1011), L = 4, m = 3
G has dimension 4 x 14
y = xG = (11,01,01,01,11,01,11)
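Example 11-3 can be reproduced by building the interleaved generator matrix directly; a sketch, with the row construction from the previous slide:

```python
g0, g1 = [1, 0, 1, 1], [1, 1, 0, 1]      # (2,1,4) generator sequences
L, m, n = 4, 3, 2

base = [b for pair in zip(g0, g1) for b in pair]    # 11 01 10 11
width = n * (L + m)                                 # 14 columns
G = [[0] * (n * i) + base + [0] * (width - n * i - len(base))
     for i in range(L)]

x = [1, 0, 1, 1]
y = [0] * width
for xi, row in zip(x, G):
    if xi:                                          # add (XOR) selected rows
        y = [a ^ b for a, b in zip(y, row)]
```

Grouped in pairs, y reads (11,01,01,01,11,01,11), the codeword from Examples 11-1 and 11-3.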
24 Polynomial Representation
Introduce the delay operator D:
x = (x0, x1, x2, ...)  ↔  X(D) = x0 + x1·D + x2·D^2 + ...
y(0) = (y0(0), y1(0), y2(0), ...)  ↔  Y(0)(D) = y0(0) + y1(0)·D + y2(0)·D^2 + ...
y(1) = (y0(1), y1(1), y2(1), ...)  ↔  Y(1)(D) = y0(1) + y1(1)·D + y2(1)·D^2 + ...
g(i)  ↔  G(i)(D)
25 (2,1,4) Convolutional Encoder
26 Polynomial Representation
For the (2,1,4) code example:
G(0)(D) = 1+D^2+D^3
G(1)(D) = 1+D+D^3
X(D) = 1+D^2+D^3
Use polynomial multiplication to obtain the output:
Y(0)(D) = X(D)G(0)(D) = 1+D^4+D^6
Y(1)(D) = X(D)G(1)(D) = 1+D+D^2+D^3+D^4+D^5+D^6
Y(D) = Y(0)(D^2) + D·Y(1)(D^2) = 1+D+D^3+D^5+D^7+D^8+D^9+D^11+D^12+D^13
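The D-domain computation is the same convolution expressed as polynomial multiplication over GF(2), and the multiplexing step Y(D) = Y(0)(D^2) + D·Y(1)(D^2) simply interleaves the two coefficient lists. A sketch, with coefficient index = power of D:

```python
def polymul_gf2(a, b):
    """Multiply two GF(2) polynomials given as coefficient lists."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

def multiplex(y0, y1):
    """Y(D) = Y0(D^2) + D*Y1(D^2): interleave the coefficient lists."""
    out = [0] * (2 * max(len(y0), len(y1)))
    for i, c in enumerate(y0):
        out[2 * i] ^= c
    for i, c in enumerate(y1):
        out[2 * i + 1] ^= c
    return out

X  = [1, 0, 1, 1]                 # X(D)  = 1 + D^2 + D^3
G0 = [1, 0, 1, 1]                 # G0(D) = 1 + D^2 + D^3
G1 = [1, 1, 0, 1]                 # G1(D) = 1 + D + D^3

Y0 = polymul_gf2(X, G0)           # 1 + D^4 + D^6
Y1 = polymul_gf2(X, G1)           # 1 + D + D^2 + D^3 + D^4 + D^5 + D^6
Y  = multiplex(Y0, Y1)
```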
27 Rate 1/2 (2,1,3) Code
G(0)(D) = 1+D+D^2
G(1)(D) = 1+D^2
X(D) = 1+D^3+D^4
Y(0)(D) = X(D)G(0)(D) = 1+D+D^2+D^3+D^6
Y(1)(D) = X(D)G(1)(D) = 1+D^2+D^3+D^4+D^5+D^6
Y(D) = Y(0)(D^2) + D·Y(1)(D^2) = 1+D+D^2+D^4+D^5+D^6+D^7+D^9+D^11+D^12+D^13
28 For 3 outputs:
Y(D) = Y(0)(D^3) + D·Y(1)(D^3) + D^2·Y(2)(D^3)
For a multiple-input, multiple-output system:
Y(i)(D) = Σ_{j=0}^{k-1} X(j)(D) G_j(i)(D)
In matrix form, with X(D) = [X(0)(D) X(1)(D) ... X(k-1)(D)] and G(i)(D) the column vector of the G_j(i)(D):
Y(i)(D) = X(D) G(i)(D)
29 The full output is
Y(D) = (Y(0)(D), Y(1)(D), ..., Y(n-1)(D)) = X(D)G(D)
= [X(0)(D) X(1)(D) ... X(k-1)(D)] ·
[ G_0(0)(D)     G_0(1)(D)     ...  G_0(n-1)(D)
  G_1(0)(D)     G_1(1)(D)     ...  G_1(n-1)(D)
  ...
  G_(k-1)(0)(D) G_(k-1)(1)(D) ...  G_(k-1)(n-1)(D) ]
The multiplexed output is
Y(D) = Σ_{i=0}^{n-1} D^i · Y(i)(D^n)
G(D) is called the transfer-function matrix for the encoder.
30 For the (2,1,4) code:
G(D) = [1+D^2+D^3   1+D+D^3]
For the (3,2,4) code:
G_0(0)(D) = 1+D^3       G_1(0)(D) = D+D^2
G_0(1)(D) = D+D^2+D^3   G_1(1)(D) = 1+D^2
G_0(2)(D) = 1+D         G_1(2)(D) = D

G(D) = [ 1+D^3   D+D^2+D^3   1+D
         D+D^2   1+D^2       D   ]
31 Example 11-4 Rate 2/3 Code
Let x = (11,10,11), so
X(0)(D) = 1+D+D^2
X(1)(D) = 1+D^2
Y(D) = X(D)G(D):

[1+D+D^2   1+D^2] · [ 1+D^3   D+D^2+D^3   1+D ]  =  [1+D^5   1+D+D^3+D^4+D^5   1+D]
                    [ D+D^2   1+D^2       D   ]
32 To convert from matrix form to the output string:
Y(D) = Σ_{i=0}^{2} D^i · Y(i)(D^3)
= (1+D^15) + D(1+D^3+D^9+D^12+D^15) + D^2(1+D^3)
= 1+D+D^2+D^4+D^5+D^10+D^13+D^15+D^16
The output codeword is then
y = (111,011,000,010,010,110)
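Example 11-4 can be checked by multiplying the row vector X(D) by G(D), entry by entry, with GF(2) polynomial arithmetic; a sketch using the generator polynomials of the (3,2,4) code:

```python
def polymul(a, b):
    """Multiply two GF(2) polynomials (coefficient lists)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

def polyadd(a, b):
    """Add two GF(2) polynomials (coefficient lists)."""
    n = max(len(a), len(b))
    return [(a[i] if i < len(a) else 0) ^ (b[i] if i < len(b) else 0)
            for i in range(n)]

X = [[1, 1, 1], [1, 0, 1]]                   # X0 = 1+D+D^2, X1 = 1+D^2
G = [[[1, 0, 0, 1], [0, 1, 1, 1], [1, 1]],   # G row for input 0
     [[0, 1, 1],    [1, 0, 1],    [0, 1]]]   # G row for input 1

# Y(i)(D) = X0(D) G_0(i)(D) + X1(D) G_1(i)(D)
Y = [polyadd(polymul(X[0], G[0][i]), polymul(X[1], G[1][i]))
     for i in range(3)]
```

This gives Y(0) = 1+D^5, Y(1) = 1+D+D^3+D^4+D^5, Y(2) = 1+D, in agreement with the slide.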
33 Systematic Convolutional Codes
34 Systematic Convolutional Codes
The input bits are reproduced unaltered in the codeword. The generator matrix has the band form

G = [ I P_0   0 P_1   0 P_2   ...   0 P_m
              I P_0   0 P_1   ...   0 P_(m-1)   0 P_m
                      ...                           ]

where
I: k x k identity matrix
0: k x k all-zero matrix
P_l: k x (n-k) matrix with rows (g_j,l(k), g_j,l(k+1), ..., g_j,l(n-1)) for j = 0, ..., k-1
35 Systematic Encoder Example
(2,1,4) systematic encoder
g(0) = (1000)
g(1) = (1101)
36 Systematic Encoder
The generator matrix is formed by interleaving g(0) = (1000) and g(1) = (1101):

G = [ 11 01 00 01 00 00 00
      00 11 01 00 01 00 00
      ... ]

G(D) = [1   1+D+D^3]
For X(D) = 1+D^2+D^3:
Y(0)(D) = X(D)G(0)(D) = X(D) = 1+D^2+D^3
Y(1)(D) = X(D)G(1)(D) = 1+D+D^2+D^3+D^4+D^5+D^6
37 Y(D) = Y(0)(D^2) + D·Y(1)(D^2)
= 1+D^4+D^6 + D(1+D^2+D^4+D^6+D^8+D^10+D^12)
= 1+D+D^3+D^4+D^5+D^6+D^7+D^9+D^11+D^13
The output codeword is then
y = (11,01,11,11,01,01,01)
Equivalently, with x = (1011), y = xG.
38 LTE Turbo Code
The constituent encoder is recursive and systematic:
G(D) = [1   g1(D)/g0(D)]
g0(D) = 1+D^2+D^3
g1(D) = 1+D+D^3
39 State Diagrams
The state of an encoder is defined as the shift register contents. For an (n,1,K) code there are 2^(K-1) = 2^m states.
Each branch in the state diagram has a label of the form X/Y, where X is the input bit that causes the state transition and Y is the corresponding output bits.
Path: a sequence of nodes connected by branches
Circuit: a path that starts and stops at the same node
Loop: a circuit that does not enter any node more than once
40 Rate 1/2 Linear Convolutional Encoder
41 State Diagram for the (2,1,4) Encoder
42 Example 11-5: Consider the input x = (1011)
Begin at state 000 (register contents s1 s2 s3):

input  output  state
  -      -      000
  1      11     100
  0      01     010
  1      01     101
  1      01     110
  0      11     011
  0      01     001
  0      11     000

The three trailing zeros return the encoder to the all-zero state.
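The table in Example 11-5 can be generated by stepping a small state machine: the state is the register contents (s1, s2, s3), and each output bit is the XOR of the taps selected by a generator sequence. A sketch for the (2,1,4) code:

```python
G0 = (1, 0, 1, 1)                          # g(0) = (1011)
G1 = (1, 1, 0, 1)                          # g(1) = (1101)

def step(state, bit):
    """One encoder step: return (next_state, output_pair)."""
    taps = (bit,) + state                  # current input followed by registers
    y0 = sum(t & g for t, g in zip(taps, G0)) % 2
    y1 = sum(t & g for t, g in zip(taps, G1)) % 2
    return taps[:3], (y0, y1)              # shift: input enters the register

state, outputs = (0, 0, 0), []
for bit in [1, 0, 1, 1, 0, 0, 0]:          # message plus three flushing zeros
    state, y = step(state, bit)
    outputs.append(y)
```

The output pairs reproduce the table, and the final state is again 000.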
43 Catastrophic Codes
Catastrophic code: a code for which an infinite weight input causes a finite weight output.
Result: a finite number of channel errors can cause an infinite number of decoding errors.
In terms of the state diagram: a loop corresponding to a nonzero input for which all the output bits are zero denotes a catastrophic code.
44 Catastrophic Encoder
45 g(0) = (1111)
g(1) = (1111)
x = (111111...)
y = (11,00,11,00,00,00,...)
Now suppose r = (10,00,00,00,00,00,...)
A maximum likelihood decoder will choose
c' = (00,00,00,00,00,00,...)
which corresponds to x' = (000000...) and an infinite number of errors.
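The bounded output weight can be checked numerically. With g(0) = g(1) = (1111), an all-ones input of any length n ≥ 4 produces total output weight 8 when the convolution is truncated (the semi-infinite stream on the slide, which never terminates, has weight 4; the extra 4 here come from the tail of the finite convolution). A sketch:

```python
def conv_gf2(x, g):
    """Convolve two binary sequences over GF(2)."""
    y = [0] * (len(x) + len(g) - 1)
    for i, xi in enumerate(x):
        for j, gj in enumerate(g):
            y[i + j] ^= xi & gj
    return y

g = [1, 1, 1, 1]
# Both output streams are identical; total weight stays constant
# no matter how long the all-ones input is.
weights = [2 * sum(conv_gf2([1] * n, g)) for n in (10, 50, 200)]
```

An unbounded-weight input mapping to a bounded-weight output is exactly the catastrophic condition.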
46 State Diagram for a Catastrophic Code
47 Another Catastrophic Encoder
g(0) = (0111)
g(1) = (1110)
x = (1,1,0,1,1,0,1,1,...)
y = (01,10,00,00,00,00,...)
48 The Corresponding State Diagram
49 Theorem 11-1 Catastrophic Codes
Let C be a rate 1/n convolutional code with transfer function matrix G(D) whose generator sequences have the transforms {G(0)(D), G(1)(D), ..., G(n-1)(D)}. C is not catastrophic if and only if
GCD(G(0)(D), G(1)(D), ..., G(n-1)(D)) = D^l for some l ≥ 0.
50 Catastrophic Code Examples
Example 11-6 (previous code):
g(0) = (0111), g(1) = (1110)
G(0)(D) = D+D^2+D^3
G(1)(D) = 1+D+D^2
GCD(G(0)(D), G(1)(D)) = 1+D+D^2 ≠ D^l
For the other code:
g(0) = (1111), g(1) = (1111)
GCD(G(0)(D), G(1)(D)) = 1+D+D^2+D^3 ≠ D^l
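The GCD test of Theorem 11-1 is easy to run by machine; here is a sketch using Euclid's algorithm on GF(2) polynomials stored as integer bit masks (bit i = coefficient of D^i):

```python
def deg(p):
    """Degree of a GF(2) polynomial stored as an integer bit mask."""
    return p.bit_length() - 1

def gf2_mod(a, b):
    """Remainder of GF(2) polynomial division a mod b."""
    while a and deg(a) >= deg(b):
        a ^= b << (deg(a) - deg(b))
    return a

def gf2_gcd(a, b):
    """GCD of two GF(2) polynomials via Euclid's algorithm."""
    while b:
        a, b = b, gf2_mod(a, b)
    return a

# Example 11-6: G0(D) = D+D^2+D^3, G1(D) = 1+D+D^2
catastrophic_gcd = gf2_gcd(0b1110, 0b0111)   # 1+D+D^2, not a power of D
# The (2,1,4) code: G0(D) = 1+D^2+D^3, G1(D) = 1+D+D^3
good_gcd = gf2_gcd(0b1101, 0b1011)           # 1, so not catastrophic
```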
51 The Weight Enumerator
Label each branch in the state diagram with X^i Y^j, where i is the weight of the output sequence and j is the weight of the input sequence. The transfer function
T(X,Y) = Σ_{i,j} a_{i,j} X^i Y^j
can be obtained using Mason's formula.
52 The Weight Enumerator
Each branch can also be labeled with X^i L^k, where i is the weight of the output sequence and k is the branch length. The transfer function
T(X,L) = Σ_{i,k} b_{i,k} X^i L^k
can be obtained using Mason's formula.
53 (2,1,3) Convolutional Encoder
54 State Diagram for the (2,1,3) Encoder
Solid lines represent input 0; dashed lines represent input 1.
state  SR contents
  a       00
  b       10
  c       01
  d       11
55 Example: Consider the input sequence x = (10011)
Begin at state a:

input  output  new state
  1      11       b
  0      10       c
  0      11       a
  1      11       b
  1      01       d

To return to the all-zero state:

input  output  new state
  0      01       c
  0      11       a
56 Modified State Diagram for the (2,1,3) Code
X = output weight, Y = input weight. Branch labels:
a→b: X^2 Y    b→c: X    b→d: X Y
c→a: X^2      c→b: Y    d→c: X    d→d: X Y
57 Modified State Diagram for the (2,1,3) Code
X = output weight, L = branch length. Branch labels:
a→b: X^2 L    b→c: X L    b→d: X L
c→a: X^2 L    c→b: L      d→c: X L    d→d: X L
58 Mason's Formula
T(X,Y) = (Σ_i F_i Δ_i) / Δ
F_i: gain of the ith forward path
Δ: graph determinant
Δ = 1 − Σ_l C_l + Σ_{l,m} C_l C_m − Σ_{l,m,n} C_l C_m C_n + ...
where the sums run over all loops C_l, all pairs of non-touching loops, all triples of non-touching loops, and so on.
Δ_i: cofactor of path i, computed like Δ but using only the loops that do not touch path i.
59 (2,1,3) Code
Two forward paths:
a0 → b → c → a1:      F1 = X^2Y · X · X^2 = X^5 Y
a0 → b → d → c → a1:  F2 = X^2Y · XY · X · X^2 = X^6 Y^2
Three loops:
d → d:          C1 = XY
b → d → c → b:  C2 = XY · X · Y = X^2 Y^2
b → c → b:      C3 = X · Y = XY
Only C1 and C3 do not touch each other, so
Δ = 1 − (XY + X^2Y^2 + XY) + XY·XY = 1 − 2XY
Δ1 = 1 − XY (loop C1 does not touch path 1)
Δ2 = 1
60 (2,1,3) Code
T(X,Y) = (Σ_i F_i Δ_i) / Δ = [X^5 Y (1 − XY) + X^6 Y^2] / (1 − 2XY) = X^5 Y / (1 − 2XY)
T(X) = T(X,Y)|_{Y=1} = X^5 / (1 − 2X) = X^5 + 2X^6 + 4X^7 + 8X^8 + ...
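The series expansion of T(X) can be generated by power-series long division; a sketch with coefficient lists indexed by power of X, assuming a constant term of 1 in the denominator:

```python
def series_coeffs(num, den, n_terms):
    """First n_terms power-series coefficients of num(X)/den(X), den[0] == 1."""
    work = num[:] + [0] * n_terms
    out = []
    for i in range(n_terms):
        c = work[i]                   # next series coefficient
        out.append(c)
        for j, d in enumerate(den):   # subtract c * X^i * den(X)
            if i + j < len(work):
                work[i + j] -= c * d
    return out

# T(X) = X^5 / (1 - 2X)
coeffs = series_coeffs([0, 0, 0, 0, 0, 1], [1, -2], 9)
```

The coefficients read off as one path of output weight 5, two of weight 6, four of weight 7, and so on.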
61 (2,1,3) Code
Two forward paths:
a0 → b → c → a1:      F1 = X^2L · XL · X^2L = X^5 L^3
a0 → b → d → c → a1:  F2 = X^2L · XL · XL · X^2L = X^6 L^4
Three loops:
d → d:          C1 = XL
b → d → c → b:  C2 = XL · XL · L = X^2 L^3
b → c → b:      C3 = XL · L = X L^2
Δ = 1 − (XL + X^2L^3 + XL^2) + XL·XL^2 = 1 − XL − XL^2
Δ1 = 1 − XL
Δ2 = 1
62 (2,1,3) Code
T(X,L) = (Σ_i F_i Δ_i) / Δ = [X^5 L^3 (1 − XL) + X^6 L^4] / (1 − XL(1+L)) = X^5 L^3 / (1 − XL(1+L))
T(X) = T(X,L)|_{L=1} = X^5 / (1 − 2X) = X^5 + 2X^6 + 4X^7 + 8X^8 + ...
63 (2,1,4) Code
T(X,Y) = X^6(Y + Y^3) + X^8(3Y^2 + 5Y^4 + 2Y^6) + ...
T(X) = 2X^6 + 10X^8 + 49X^10 + ...
64 Minimum Free Distance
For the (2,1,4) code, T(X) = 2X^6 + 10X^8 + 49X^10 + ...
The minimum free distance d_free is the minimum Hamming distance between all pairs of convolutional codewords. By linearity, it is the minimum weight of a nonzero output sequence starting from the all-zero state and ending in the all-zero state:
d_free = min{ d(y', y'') : y' ≠ y'' } = min{ w(y) : y ≠ 0 }
For the (2,1,4) code, there are two codewords of weight 6, thus d_free = 6.
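For a small code, d_free can also be found by brute force: encode every nonzero message up to a modest length (with the zero tail implicit in the full convolution) and take the minimum output weight. A sketch for the (2,1,4) code, which confirms the value read off T(X):

```python
from itertools import product

def encode(x, generators=((1, 0, 1, 1), (1, 1, 0, 1))):
    """Encode x with the (2,1,4) code and multiplex the output streams."""
    streams = []
    for g in generators:
        y = [0] * (len(x) + len(g) - 1)
        for i, xi in enumerate(x):
            for j, gj in enumerate(g):
                y[i + j] ^= xi & gj
        streams.append(y)
    return [b for pair in zip(*streams) for b in pair]

# Minimum weight over all nonzero messages of length 1..7
d_free = min(sum(encode(msg))
             for n in range(1, 8)
             for msg in product([0, 1], repeat=n)
             if any(msg))
```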
65 Maximum Free Distance Codes
Nonsystematic convolutional codes typically provide a higher minimum free distance than systematic codes with the same constraint length and rate.
67 Best Known Convolutional Codes
69 Peter Elias ( ) Coding for Noisy Channels
Contents 10 System Design with Codes 147 10.1 Convolutional Codes and Implementation........................... 148 10.1.1 Notation and Terminology................................ 148 10.1.2 The Convolutional
More informationA Hyper-Trellis based Turbo Decoder for Wyner-Ziv Video Coding
A Hyper-Trellis based Turbo Decoder for Wyner-Ziv Video Coding Arun Avudainayagam, John M. Shea and Dapeng Wu Wireless Information Networking Group (WING) Department of Electrical and Computer Engineering
More informationExact Probability of Erasure and a Decoding Algorithm for Convolutional Codes on the Binary Erasure Channel
Exact Probability of Erasure and a Decoding Algorithm for Convolutional Codes on the Binary Erasure Channel Brian M. Kurkoski, Paul H. Siegel, and Jack K. Wolf Department of Electrical and Computer Engineering
More informationEntropy as a measure of surprise
Entropy as a measure of surprise Lecture 5: Sam Roweis September 26, 25 What does information do? It removes uncertainty. Information Conveyed = Uncertainty Removed = Surprise Yielded. How should we quantify
More informationSymbol Interleaved Parallel Concatenated Trellis Coded Modulation
Symbol Interleaved Parallel Concatenated Trellis Coded Modulation Christine Fragouli and Richard D. Wesel Electrical Engineering Department University of California, Los Angeles christin@ee.ucla. edu,
More informationUNIT I INFORMATION THEORY. I k log 2
UNIT I INFORMATION THEORY Claude Shannon 1916-2001 Creator of Information Theory, lays the foundation for implementing logic in digital circuits as part of his Masters Thesis! (1939) and published a paper
More informationLecture 2 Linear Codes
Lecture 2 Linear Codes 2.1. Linear Codes From now on we want to identify the alphabet Σ with a finite field F q. For general codes, introduced in the last section, the description is hard. For a code of
More informationChapter 5: Data Compression
Chapter 5: Data Compression Definition. A source code C for a random variable X is a mapping from the range of X to the set of finite length strings of symbols from a D-ary alphabet. ˆX: source alphabet,
More informationSynchronous Sequential Circuit
Synchronous Sequential Circuit The change of internal state occurs in response to the synchronized clock pulses. Data are read during the clock pulse (e.g. rising-edge triggered) It is supposed to wait
More informationSIGNAL COMPRESSION Lecture Shannon-Fano-Elias Codes and Arithmetic Coding
SIGNAL COMPRESSION Lecture 3 4.9.2007 Shannon-Fano-Elias Codes and Arithmetic Coding 1 Shannon-Fano-Elias Coding We discuss how to encode the symbols {a 1, a 2,..., a m }, knowing their probabilities,
More informationRate-adaptive turbo-syndrome scheme for Slepian-Wolf Coding
Rate-adaptive turbo-syndrome scheme for Slepian-Wolf Coding Aline Roumy, Khaled Lajnef, Christine Guillemot 1 INRIA- Rennes, Campus de Beaulieu, 35042 Rennes Cedex, France Email: aroumy@irisa.fr Abstract
More informationCyclic Redundancy Check Codes
Cyclic Redundancy Check Codes Lectures No. 17 and 18 Dr. Aoife Moloney School of Electronics and Communications Dublin Institute of Technology Overview These lectures will look at the following: Cyclic
More informationECE 4450:427/527 - Computer Networks Spring 2017
ECE 4450:427/527 - Computer Networks Spring 2017 Dr. Nghi Tran Department of Electrical & Computer Engineering Lecture 5.2: Error Detection & Correction Dr. Nghi Tran (ECE-University of Akron) ECE 4450:427/527
More informationChapter 2. Digital Logic Basics
Chapter 2 Digital Logic Basics 1 2 Chapter 2 2 1 Implementation using NND gates: We can write the XOR logical expression B + B using double negation as B+ B = B+B = B B From this logical expression, we
More informationPolar Code Construction for List Decoding
1 Polar Code Construction for List Decoding Peihong Yuan, Tobias Prinz, Georg Böcherer arxiv:1707.09753v1 [cs.it] 31 Jul 2017 Abstract A heuristic construction of polar codes for successive cancellation
More informationSample Test Paper - I
Scheme G Sample Test Paper - I Course Name : Computer Engineering Group Marks : 25 Hours: 1 Hrs. Q.1) Attempt any THREE: 09 Marks a) Define i) Propagation delay ii) Fan-in iii) Fan-out b) Convert the following:
More informationarxiv: v1 [quant-ph] 29 Apr 2010
Minimal memory requirements for pearl necklace encoders of quantum convolutional codes arxiv:004.579v [quant-ph] 29 Apr 200 Monireh Houshmand and Saied Hosseini-Khayat Department of Electrical Engineering,
More information6.1.1 What is channel coding and why do we use it?
Chapter 6 Channel Coding 6.1 Introduction 6.1.1 What is channel coding and why do we use it? Channel coding is the art of adding redundancy to a message in order to make it more robust against noise. It
More informationError Detection and Correction: Hamming Code; Reed-Muller Code
Error Detection and Correction: Hamming Code; Reed-Muller Code Greg Plaxton Theory in Programming Practice, Spring 2005 Department of Computer Science University of Texas at Austin Hamming Code: Motivation
More informationCOMPSCI 650 Applied Information Theory Apr 5, Lecture 18. Instructor: Arya Mazumdar Scribe: Hamed Zamani, Hadi Zolfaghari, Fatemeh Rezaei
COMPSCI 650 Applied Information Theory Apr 5, 2016 Lecture 18 Instructor: Arya Mazumdar Scribe: Hamed Zamani, Hadi Zolfaghari, Fatemeh Rezaei 1 Correcting Errors in Linear Codes Suppose someone is to send
More information16.36 Communication Systems Engineering
MIT OpenCourseWare http://ocw.mit.edu 16.36 Communication Systems Engineering Spring 2009 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms. 16.36: Communication
More informationError Detection & Correction
Error Detection & Correction Error detection & correction noisy channels techniques in networking error detection error detection capability retransmition error correction reconstruction checksums redundancy
More information