Name : ....................
Roll No. : ....................
Invigilator's Signature : ....................

CS/B.TECH(ECE)/SEM-7/EC-703/2011-12

2011
CODING & INFORMATION THEORY

Time Allotted : 3 Hours                                   Full Marks : 70

The figures in the margin indicate full marks.
Candidates are required to give their answers in their own words as far as practicable.

GROUP A ( Multiple Choice Type Questions )

1. Choose the correct alternatives for any ten of the following :   10 × 1 = 10

   i) A ( 7, 4 ) linear block code has a code rate of
      a) 7
      b) 4
      c) 1.75
      d) 0.571.

   ii) Entropy represents
      a) amount of information
      b) rate of information
      c) measure of uncertainty
      d) probability of message.

   iii) The channel capacity is a measure of
      a) entropy rate
      b) maximum rate of information a channel can handle
      c) information contents of messages transmitted in a channel
      d) none of these.

   iv) The Hamming distance between V = 1100001011 and W = 1001101001 is
      a) 1
      b) 5
      c) 3
      d) 4.

   v) An encoder for a ( 4, 3, 5 ) convolutional code has a memory order of
      a) 4
      b) 2
      c) 3
      d) 5.

   vi) Which of the following expressions is incorrect?
      a) H ( y/x ) = H ( x, y ) - H ( x )
      b) I ( x, y ) = H ( x ) - H ( y/x )
      c) H ( x/y ) = H ( x, y ) + H ( y )
      d) I ( x, y ) = H ( y ) - H ( y/x ).

   vii) A polynomial is called monic if
      a) odd terms are unity
      b) even terms are unity
      c) leading coefficient is unity
      d) leading coefficient is zero.

   viii) Which of the following techniques is used by the Viterbi algorithm for decoding?
      a) Code tree
      b) Trellis
      c) State diagram
      d) Parity generator.

   ix) The generator polynomial of a cyclic code is a factor of
      a) x^n + 1
      b) x^(n+1) + 1
      c) x^(n+2) + 1
      d) none of these.

   x) Consider the parity-check matrix

         H = | 1 0 0 |
             | 0 1 0 |
             | 0 0 1 |
             | 1 1 0 |
             | 0 1 1 |
             | 1 0 1 |

      and the received vector r = ( 001110 ). Then the syndrome is given by
      a) ( 110 )
      b) ( 100 )
      c) ( 111 )
      d) ( 101 ).

   xi) For a ( 7, 4 ) cyclic code generated by g ( X ) = 1 + X + X^3, the syndrome for the error pattern e ( X ) = X^3 is
      a) 101
      b) 111
      c) 110
      d) 011.

   xii) The number of undetectable errors for an ( n, k ) linear code is
      a) 2^(n-k)
      b) 2^n
      c) 2^n - 2^k
      d) 2^k.
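
The computational items in Group A can be verified with a little GF(2) arithmetic. Below is a minimal Python sketch for (iv), (x) and (xi), with the matrix of (x) taken row by row as printed above; helper names are illustrative.

# (iv) Hamming distance: the number of positions in which two equal-length words differ.
def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

print(hamming_distance("1100001011", "1001101001"))   # 4 -> option (d)

# (x) Syndrome as a modulo-2 product of the received vector with the matrix rows above.
H_rows = [(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 0), (0, 1, 1), (1, 0, 1)]
r = (0, 0, 1, 1, 1, 0)
s = [sum(r[i] * H_rows[i][j] for i in range(6)) % 2 for j in range(3)]
print(s)                                              # [1, 0, 0] -> option (b)

# (xi) Syndrome polynomial: remainder of e(X) on division by g(X) over GF(2),
# with polynomials stored as integers whose bit i is the coefficient of X^i.
def gf2_mod(dividend, divisor):
    while dividend.bit_length() >= divisor.bit_length():
        dividend ^= divisor << (dividend.bit_length() - divisor.bit_length())
    return dividend

g = 0b1011    # g(X) = 1 + X + X^3
e = 0b1000    # e(X) = X^3
print(bin(gf2_mod(e, g)))                             # 0b11, i.e. 1 + X -> syndrome 110, option (c)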

GROUP B ( Short Answer Type Questions )
Answer any three of the following.   3 × 5 = 15

2. a) Differentiate between block cipher and stream cipher.   2
   b) What do you mean by symmetric-key and asymmetric-key cryptography? What is a 'man-in-the-middle' attack?   2 + 1

3. An ( 8, 4 ) cyclic code is generated by g ( X ) = 1 + X + X^4. Find the generator and parity-check matrices in systematic form.   3 + 2

4. a) What is the systematic structure of a code word?   1
   b) What is a syndrome and what is its significance? Draw the syndrome circuit for a ( 7, 4 ) linear block code with parity-check matrix

         H = | 1 0 0 1 0 1 1 |
             | 0 1 0 1 1 1 0 |
             | 0 0 1 0 1 1 1 |                       2 + 2

5. For a ( 2, 1, 3 ) convolutional encoder the generator sequences are g^(0) = ( 1000 ) and g^(1) = ( 1101 ).

6. Determine the generator polynomial of a double-error-correcting BCH code of block length n = 15.
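
For Question 6, the usual construction takes g(X) as the LCM (here simply the product) of the minimal polynomials of alpha and alpha^3, assuming GF(16) is built from the primitive polynomial X^4 + X + 1, so that the two minimal polynomials are X^4 + X + 1 and X^4 + X^3 + X^2 + X + 1. A short GF(2) multiplication sketch:

def gf2_mul(a, b):
    # Multiply two GF(2) polynomials, each encoded as an integer with bit i = coefficient of X^i.
    result = 0
    while b:
        if b & 1:
            result ^= a
        a <<= 1
        b >>= 1
    return result

def poly_str(p):
    # Human-readable form of a GF(2) polynomial.
    terms = [("1" if i == 0 else "X" if i == 1 else f"X^{i}")
             for i in range(p.bit_length()) if (p >> i) & 1]
    return " + ".join(terms)

m1 = 0b10011       # X^4 + X + 1, minimal polynomial of alpha
m3 = 0b11111       # X^4 + X^3 + X^2 + X + 1, minimal polynomial of alpha^3
g = gf2_mul(m1, m3)
print(poly_str(g)) # 1 + X^4 + X^6 + X^7 + X^8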

GROUP C ( Long Answer Type Questions )
Answer any three of the following.   3 × 15 = 45

7. Consider a systematic ( 8, 4 ) code with parity-check equations

      V0 = U0 + U1 + U2
      V1 = U1 + U2 + U3
      V2 = U0 + U1 + U3
      V3 = U0 + U2 + U3

   where U0, U1, U2 and U3 are the message digits and V0, V1, V2 and V3 are the parity-check digits.
   i) Find the generator matrix and the parity-check matrix for this code.
   ii) Find the minimum weight of this code.
   iii) Find the error-detecting and error-correcting capability of this code.
   iv) Show through an example that the code can detect three errors in a code word.   6 + 4 + 4 + 1

8. a) State and prove the Shannon-Hartley law of channel capacity.   1 + 5
   b) A Gaussian channel has a 1 MHz bandwidth. If the signal power-to-noise power spectral density ...
   c) Show that H ( X, Y ) = H ( X/Y ) + H ( Y ).   4
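
Parts (i) and (ii) of Question 7 can be cross-checked by building the systematic generator matrix G = [ P | I4 ] from the four parity equations and enumerating all sixteen codewords. A brief Python sketch, assuming the parity digits are written before the message digits (the minimum weight does not depend on this ordering):

from itertools import product

# Rows of P taken from the parity equations: entry [i][j] = 1 iff U_i appears in V_j.
P = [
    [1, 0, 1, 1],   # U0 -> V0, V2, V3
    [1, 1, 1, 0],   # U1 -> V0, V1, V2
    [1, 1, 0, 1],   # U2 -> V0, V1, V3
    [0, 1, 1, 1],   # U3 -> V1, V2, V3
]
I4 = [[1 if i == j else 0 for j in range(4)] for i in range(4)]
G = [P[i] + I4[i] for i in range(4)]   # systematic G = [ P | I4 ]

def encode(u):
    # Codeword = u . G over GF(2).
    return tuple(sum(u[i] * G[i][j] for i in range(4)) % 2 for j in range(8))

weights = [sum(encode(u)) for u in product((0, 1), repeat=4) if any(u)]
print(min(weights))   # minimum weight of the code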

9. a) Show that C = { 0000, 1100, 0011, 1111 } is a linear code. What is its minimum distance?   4 + 1
   b) A ( 7, 3 ) linear code has the following generator matrix :

         G = | 1 1 1 0 1 0 0 |
             | 0 1 1 1 0 1 0 |
             | 1 1 0 1 0 0 1 |

      Determine a systematic form of G. Hence find the parity-check matrix H for the code.   3 + 2
   c) Design the encoder circuit for the above code.   5

10. a) Write down the advantages of Huffman coding over Shannon-Fano coding.
    b) A discrete memoryless source has seven symbols x1, x2, x3, x4, x5, x6 and x7 with probabilities of occurrence P ( x1 ) = 0.05, P ( x2 ) = 0.15, P ( x3 ) = 0.2, P ( x4 ) = 0.05, P ( x5 ) = 0.15, P ( x6 ) = 0.3 and P ( x7 ) = 0.1. Construct the Huffman code and determine
       i) Entropy
       ii) Average code length
       iii) Code efficiency.   3 + 5 + 3 + 3 + 1

11. a) What are the functions of the P-box and the S-box in the DES algorithm?
    b) Explain the Diffie-Hellman key exchange algorithm.
    c) What do you mean by quantum cryptography?   4 + 9 + 2
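
Question 10 (b) lends itself to a quick check with the standard heap-based Huffman construction, comparing the source entropy against the average code length. A compact sketch using Python's heapq; the tie-breaking order, and hence the individual code words, may differ from a hand construction, but the average length does not:

import heapq, math

probs = {"x1": 0.05, "x2": 0.15, "x3": 0.2, "x4": 0.05, "x5": 0.15, "x6": 0.3, "x7": 0.1}

# Min-heap of (probability, tie-breaker, {symbol: partial code word}).
heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
counter = len(heap)
while len(heap) > 1:
    p0, _, c0 = heapq.heappop(heap)            # two least probable groups
    p1, _, c1 = heapq.heappop(heap)
    merged = {s: "0" + c for s, c in c0.items()}
    merged.update({s: "1" + c for s, c in c1.items()})
    heapq.heappush(heap, (p0 + p1, counter, merged))
    counter += 1
code = heap[0][2]

entropy = -sum(p * math.log2(p) for p in probs.values())
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(code)
print(entropy, avg_len, entropy / avg_len)     # entropy, average length, efficiency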

12. Write short notes on any three of the following :   3 × 5
    a) Shannon-Fano algorithm
    b) Advanced version of DES
    c) RSA algorithm
    d) Hamming coding
    e) Viterbi algorithm.
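
For the RSA short note, the usual textbook toy example can be reproduced in a few lines of Python; the primes and message below are illustrative assumptions, not values from the paper:

p, q = 3, 11                 # toy primes (illustrative only)
n, phi = p * q, (p - 1) * (q - 1)
e = 3                        # public exponent, coprime with phi
d = pow(e, -1, phi)          # private exponent: modular inverse of e mod phi
m = 4                        # a message smaller than n
c = pow(m, e, n)             # encryption: c = m^e mod n
print(c, pow(c, d, n))       # decryption recovers m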