NAME... Soc. Sec. #... Remote Location... (if on campus write campus) FINAL EXAM EE568 KUMAR Sp '00

May 3. OPEN BOOK exam (students are permitted to bring textbooks, handwritten notes, lecture notes, etc. into the exam room). Calculators are permitted. 120 min. exam. Notation is as per my lecture notes. Answer all questions. Show all intermediate steps clearly; this applies to all questions. 8 questions, 15 pages, maximum score = 50 points. Good luck!

SCORE: Grade on Final Exam .....  Grade on Course .....

1. A rate 1/3 turbo code is able to provide a bit error probability of 10^-6 at an E_b/N_0 of 0.5 dB. If we agree to treat a bit error probability of 10^-6 as providing reliable communication, how many dB away is this from the minimum E_b/N_0 required to achieve reliable communication (as given by the channel capacity formula)? The channel in this case is the discrete additive white Gaussian noise channel modeled by

    r_i = s_i + n_i,   i >= 0,

where r_i is the i-th received signal, s_i is the i-th transmitted signal, and {n_i} is a sequence of independent, identically distributed Gaussian random variables (the noise) having zero mean and variance N_0/2. (4 points)
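The gap computation in question 1 can be sketched numerically (an illustration, not the official solution key; it assumes the usual real-channel capacity constraint R <= (1/2) log2(1 + 2R E_b/N_0), so the minimum E_b/N_0 at rate R is (2^(2R) - 1)/(2R)):

```python
import math

R = 1.0 / 3.0                 # code rate (bits per channel use)
operating_point_db = 0.5      # Eb/N0 at which the turbo code reaches P_b = 1e-6

# Capacity of the real discrete-time AWGN channel: R <= (1/2) log2(1 + 2*R*Eb/N0),
# which gives the minimum Eb/N0 for reliable communication at rate R.
ebno_min = (2 ** (2 * R) - 1) / (2 * R)
ebno_min_db = 10 * math.log10(ebno_min)

gap_db = operating_point_db - ebno_min_db
print(f"minimum Eb/N0 = {ebno_min_db:.2f} dB, gap = {gap_db:.2f} dB")
```

The minimum works out to roughly -0.55 dB, i.e. the turbo code operates about 1 dB from the Shannon limit.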

2. What is the nominal coding gain of a rate 1/3 turbo code having minimum free distance d_free = 8? (Communication is over the same discrete additive white Gaussian noise channel described in the previous question.) (2 points)
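A quick numeric sketch of this kind of computation (assuming the common definition of nominal coding gain for soft-decision decoding, gamma_c = R * d_free):

```python
import math

R = 1.0 / 3.0   # code rate
d_free = 8      # minimum free distance

gamma_c = R * d_free                   # nominal coding gain (linear scale)
gamma_c_db = 10 * math.log10(gamma_c)  # in dB
print(f"nominal coding gain = {gamma_c:.3f} ({gamma_c_db:.2f} dB)")
```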

3. Consider the convolutional code encoded by the polynomial generator matrix

    G(D) = [ 1+D    1+D^2+D^3    D^2
             1      1+D          1   ].

(a) What is the external degree of the convolutional encoder? (2 points)
(b) What is the internal degree of the convolutional encoder? (2 points)
(c) What can you say concerning the minimum number of delay elements required to implement the encoder? (2 points)
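One way to check the degree computations (a sketch using the usual definitions: external degree = sum over rows of the largest entry degree; internal degree = maximum degree among the k x k minors). GF(2) polynomials are stored as integer bitmasks, bit i being the coefficient of D^i:

```python
def pdeg(p):
    """Degree of a GF(2) polynomial stored as a bitmask (returns -1 for the zero polynomial)."""
    return p.bit_length() - 1

def pmul(a, b):
    """Multiply two GF(2) polynomials (carry-less multiplication)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

# G(D) from question 3; bit i of each entry is the coefficient of D^i.
#   row 1: 1+D, 1+D^2+D^3, D^2
#   row 2: 1,   1+D,       1
G = [[0b11, 0b1101, 0b100],
     [0b1,  0b11,   0b1]]

# External degree: sum over rows of the maximum entry degree in that row.
ext_deg = sum(max(pdeg(p) for p in row) for row in G)

# Internal degree: maximum degree among the 2x2 minors (over GF(2), minus is XOR).
minors = []
for j in range(3):
    for k in range(j + 1, 3):
        minors.append(pmul(G[0][j], G[1][k]) ^ pmul(G[0][k], G[1][j]))
int_deg = max(pdeg(m) for m in minors)

print("external degree:", ext_deg, " internal degree:", int_deg)
```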

4. Does the convolutional encoder having polynomial generator matrix

    G(D) = [ 1+D    1+D^2+D^3    D^2
             1      1+D          D   ]

have catastrophic error propagation? (4 points)
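A sketch of the standard test (a full-rank polynomial encoder is non-catastrophic iff the GCD of its k x k minors is a power of D); this checks the criterion mechanically, it is not the graded solution:

```python
def pdeg(p):
    """Degree of a GF(2) polynomial stored as a bitmask."""
    return p.bit_length() - 1

def pmul(a, b):
    """Multiply two GF(2) polynomials."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

def pmod(a, b):
    """Remainder of GF(2) polynomial division, a mod b (b != 0)."""
    while pdeg(a) >= pdeg(b):
        a ^= b << (pdeg(a) - pdeg(b))
    return a

def pgcd(a, b):
    """GCD of two GF(2) polynomials by the Euclidean algorithm."""
    while b:
        a, b = b, pmod(a, b)
    return a

# G(D) from question 4 (bit i = coefficient of D^i):
#   row 1: 1+D, 1+D^2+D^3, D^2
#   row 2: 1,   1+D,       D
G = [[0b11, 0b1101, 0b100],
     [0b1,  0b11,   0b10]]

minors = []
for j in range(3):
    for k in range(j + 1, 3):
        minors.append(pmul(G[0][j], G[1][k]) ^ pmul(G[0][k], G[1][j]))

g = 0
for m in minors:
    g = pgcd(g, m)

# Non-catastrophic iff gcd of minors = D^s for some s >= 0,
# i.e. the bitmask has exactly one bit set.
catastrophic = bin(g).count("1") != 1
print("gcd of minors:", bin(g), " catastrophic:", catastrophic)
```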

5. Consider the convolutional code having polynomial generator matrix

    G(D) = [ D    1+D ].

(a) Draw the state and trellis diagrams (out to the 4th node level, starting from node level 0) of this code. As in class, use dotted lines to represent a message symbol that is 1 and a solid line otherwise. Label each branch of the trellis with the corresponding code symbols. (4 points)
(b) Determine the generating function A_F(L, I, D) of the code (notation as in class). (4 points)
(c) What is the free distance d_free of the convolutional code? Explain how you determined d_free. (2 points)
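For part (c), the free distance can be sanity-checked by a shortest-path search over the code's state diagram (a sketch; it assumes state = previous input bit, with branch outputs c1 = u_{t-1} and c2 = u_t + u_{t-1}, matching G(D) = [D 1+D]):

```python
from heapq import heappush, heappop

def branch(state, u):
    """One trellis branch of G(D) = [D, 1+D]: returns (Hamming weight, next state).

    state = previous input bit; outputs are c1 = state, c2 = u ^ state.
    """
    c1, c2 = state, u ^ state
    return c1 + c2, u

# Dijkstra-style search for the minimum-weight detour that leaves the
# all-zero state and merges back into it; that weight is d_free.
w0, s0 = branch(0, 1)      # any detour starts with input 1 from state 0
heap = [(w0, s0)]
best = {}
d_free = None
while heap:
    w, s = heappop(heap)
    if s == 0:             # merged back into the zero state: detour complete
        d_free = w
        break
    if s in best and best[s] <= w:
        continue
    best[s] = w
    for u in (0, 1):
        dw, ns = branch(s, u)
        heappush(heap, (w + dw, ns))
print("d_free =", d_free)
```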

Extra workspace

6. Determine an upper bound on the bit error probability P_be of the rate 1/2 convolutional code having generating function

    A_F(L=1, I, D) = (I^2 D^6 + I^3 D^5 - I^4 D^6) / (1 - (2ID - I^2 D^2 + D^2)).

The channel in this instance is a binary symmetric channel having crossover probability eps = 0.001. (4 points)
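The numerical evaluation this question calls for can be sketched as follows (assuming the standard union bound P_be <= dA_F/dI at I = 1, with D set to the Bhattacharyya parameter 2*sqrt(eps*(1-eps)) of the BSC; the derivative is taken numerically here rather than in closed form):

```python
import math

def A(I, D):
    """Generating function A_F(L=1, I, D) from question 6."""
    num = I**2 * D**6 + I**3 * D**5 - I**4 * D**6
    den = 1.0 - (2 * I * D - I**2 * D**2 + D**2)
    return num / den

eps = 0.001
D = 2 * math.sqrt(eps * (1 - eps))   # Bhattacharyya parameter of the BSC

# Union bound on bit error probability: P_be <= dA/dI at I = 1
# (central-difference numerical derivative).
h = 1e-6
p_be_bound = (A(1 + h, D) - A(1 - h, D)) / (2 * h)
print(f"P_be <= {p_be_bound:.3e}")
```

The bound comes out on the order of a few times 10^-6 at this crossover probability.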

Extra workspace

7. Draw a (recursive systematic) encoder for the rate 1/2 convolutional code having generator matrix

    G(D) = [ 1    (1+D)/(1+D+D^3) ].

(4 points)
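As a check on the structure of such an encoder, its shift-register behavior can be simulated directly (a sketch, assuming G(D) = [1 (1+D)/(1+D+D^3)], i.e. feedback polynomial 1+D+D^3 and feedforward polynomial 1+D; the controller-canonical register arrangement below is an implementation choice):

```python
def rsc_encode(bits):
    """Rate-1/2 RSC encoder with G(D) = [1, (1+D)/(1+D+D^3)] (assumed form).

    Controller canonical form: w_t = u_t + w_{t-1} + w_{t-3} (feedback 1+D+D^3),
    parity p_t = w_t + w_{t-1} (feedforward 1+D); the first output is systematic.
    """
    s = [0, 0, 0]          # s[0] = w_{t-1}, s[1] = w_{t-2}, s[2] = w_{t-3}
    out = []
    for u in bits:
        w = u ^ s[0] ^ s[2]    # feedback taps at D and D^3
        p = w ^ s[0]           # feedforward 1 + D
        out.append((u, p))     # (systematic bit, parity bit)
        s = [w, s[0], s[1]]    # shift the register
    return out

# Impulse response: the parity stream should follow the power series
# expansion of (1+D)/(1+D+D^3).
pairs = rsc_encode([1, 0, 0, 0, 0, 0, 0])
print(pairs)
```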

8. The identical figures on the next 3 pages show the trellis diagram of a convolutional code having

    G(D) = [ 1+D+D^2    1+D ],

in which the trellis begins at node level 0 and has been terminated at node level t = 4. The received signal over a binary symmetric channel having crossover probability eps << 1 is as shown below:

    {(r_1(t), r_2(t))} = 01 11 11 11.

(a) Use the BCJR algorithm to determine the most likely third bit u_2. (The message sequence symbols are u_0, u_1, u_2, u_3.) As intermediate steps:
    i. On the trellis marked (FORWARD), write in the appropriate locations the relevant alpha_k(s_k) (or scaled versions of these) that you will need to implement the BCJR algorithm in this case. (3 points)
    ii. On the trellis marked (BACKWARD), write in the appropriate locations the relevant beta_{k+1}(s_{k+1}) (or scaled versions of these) that you will need to implement the BCJR algorithm in this case. (3 points)
    iii. On the trellis marked (BACKWARD), also write in the relevant gamma_k(s_k, s_{k+1}) (or scaled versions of these) that you will need. (3 points)
    iv. Compute the likelihood ratio Pr(u_k = 1 | r) / Pr(u_k = 0 | r), where r denotes the received vector. (3 points)
(b) Use the Viterbi algorithm and the trellis marked (VITERBI) to determine the most likely message sequence u_0 u_1 u_2 u_3. (4 points)
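For part (b), a hard-decision Viterbi decode of this received word can be sketched in code (assumptions: state = (u_{t-1}, u_{t-2}), branch outputs c1 = u + u_{t-1} + u_{t-2} and c2 = u + u_{t-1} from G(D) = [1+D+D^2 1+D], and a trellis that starts and terminates in the all-zero state):

```python
def viterbi(received):
    """Hard-decision Viterbi over the 4-state trellis of G(D) = [1+D+D^2, 1+D]."""
    # state = (u_{t-1}, u_{t-2}); metric = Hamming distance; path = input bits
    paths = {(0, 0): (0, [])}               # trellis starts in the all-zero state
    for r1, r2 in received:
        new = {}
        for (a, b), (m, path) in paths.items():
            for u in (0, 1):
                c1 = u ^ a ^ b              # output of 1 + D + D^2
                c2 = u ^ a                  # output of 1 + D
                dm = (c1 ^ r1) + (c2 ^ r2)  # branch Hamming distance
                s = (u, a)
                cand = (m + dm, path + [u])
                if s not in new or cand[0] < new[s][0]:
                    new[s] = cand           # keep the survivor into state s
        paths = new
    return paths[(0, 0)]                    # terminated: end in the all-zero state

r = [(0, 1), (1, 1), (1, 1), (1, 1)]
metric, u_hat = viterbi(r)
print("decoded message:", u_hat, "at Hamming distance", metric)
```

Under these assumptions the survivor ends at Hamming distance 2 from the received word; the BCJR decision on u_2 asked for in part (a) must of course be worked out on the trellis itself.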

Use this page to determine and show FORWARD recursion parameters

Use this page to determine and show BACKWARD recursion parameters

Use this page to decode using the Viterbi algorithm

Extra workspace. Enjoy your summer!