
EECS 121: Coding for Digital Communication and Beyond, Fall 2013, Anant Sahai, Midterm 1

PRINT your student ID:
PRINT AND SIGN your name: (last), (first), (signature)
PRINT your Unix account login: ee121-

Prob. 1 | Prob. 2 | Prob. 3 | Total

You may consult your one handwritten note sheet. (You must turn it in with your exam.) Phones, calculators, tablets, and computers are not permitted.

No collaboration is allowed at all, and you are not allowed to look at another's work.

Please write your answers in the spaces provided in the test; in particular, we will not grade anything on another exam page unless we are clearly told in the problem space to look there.

You have 80 minutes. There are 3 questions of varying numbers of points. The questions and their parts are of varying difficulty, so avoid spending too long on any one part. 50 points is a good score. This is basically a two-hour exam, but you only have one hour and twenty minutes.

Do not turn this page until your instructor tells you to do so.

Problem 1. [True or false] (10 points)

Circle TRUE or FALSE. Prove statements that you think are true and disprove (e.g., by showing a counterexample) statements that you think are false.

(a) TRUE or FALSE: Consider an iid sequence of Bernoulli-$p$ random variables $X_i$ for $p < \frac{1}{2}$. Suppose that we define the typical set
$$T_\varepsilon^n = \left\{ \vec{x} \in \{0,1\}^n \;:\; 2^{-n(H(p)+\varepsilon)} \le P(\vec{X} = \vec{x}) \le 2^{-n(H(p)-\varepsilon)} \right\}$$
for $\varepsilon > 0$. Here $H(p) = p \log_2 \frac{1}{p} + (1-p) \log_2 \frac{1}{1-p}$. Then the set $T_\varepsilon^n$ contains the most likely binary sequence, i.e., the $n$-length sequence $\vec{x}$ that maximizes $P(\vec{X} = \vec{x})$.
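To make the definitions above concrete, here is a minimal Python sketch, with illustrative values n = 100, p = 0.11, and ε = 0.05 chosen for the sketch (not given in the problem), that computes $\log_2 P(\vec{X} = \vec{x})$ for the single most likely sequence and compares it against the two typical-set exponents. The exam, of course, asks for a proof or a counterexample, not a numerical check.

```python
import math

def binary_entropy(p):
    """H(p) = p*log2(1/p) + (1-p)*log2(1/(1-p)), in bits."""
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

# Illustrative parameters (assumptions of this sketch, not specified in the problem).
n, p, eps = 100, 0.11, 0.05

# For p < 1/2 the most likely n-length sequence is all zeros, with probability (1-p)^n.
log2_p_best = n * math.log2(1 - p)

# Membership in T_eps^n requires -n(H(p)+eps) <= log2 P(x) <= -n(H(p)-eps).
lo, hi = -n * (binary_entropy(p) + eps), -n * (binary_entropy(p) - eps)

print(f"log2 P(most likely sequence) = {log2_p_best:.2f}")
print(f"typical-set window           = [{lo:.2f}, {hi:.2f}]")
print("inside the window?", lo <= log2_p_best <= hi)
```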

Problem 2. Not quite orthogonal signaling (40 points)

Consider a continuous-time channel over the time interval $[0,1]$ with AWGN $N(t)$ of intensity 1. (e.g., $\int_0^1 N(t)\,dt$ is a standard Gaussian random variable with mean 0 and variance 1.) The signaling being used is a kind of pulse-position modulation where the potential pulses sometimes overlap with each other. There are $2^k$ possible pulses for messages corresponding to $i = 0, 1, \ldots, 2^k - 1$:
$$x_i(t) = \begin{cases} 0 & \text{if } t < \frac{i}{2^k+1} \\[4pt] \sqrt{\dfrac{P(2^k+1)}{2}} & \text{if } \frac{i}{2^k+1} \le t \le \frac{i+2}{2^k+1} \\[4pt] 0 & \text{if } t > \frac{i+2}{2^k+1} \end{cases}$$
The decoding strategy will be the same general correlate-and-then-threshold strategy. So $\tilde{Y}_i = \int_0^1 Y(t)\,\frac{x_i(t)}{\sqrt{P}}\,dt$ will be calculated and then compared against a threshold $T$ to see which one passes.

a. (4 points) Sketch what these waveforms look like for some generic $i$, $i+1$, and $i+2$.

b. (5 points) Show that $\|x_i\|^2 = P$.
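As a sanity check on the pulse definition, here is a small Python sketch (using a uniform grid on [0,1] and illustrative values k = 3 and P = 2 chosen for the sketch) that builds the pulses $x_i(t)$ and numerically approximates $\|x_i\|^2$, the quantity part (b) asks about.

```python
import numpy as np

# Illustrative parameters for this sketch (not specified in the problem).
k, P = 3, 2.0
M = 2 ** k                      # number of possible pulses / messages
S = 100_000                     # time-grid resolution on [0, 1]
t = np.linspace(0.0, 1.0, S, endpoint=False)
dt = 1.0 / S

def pulse(i):
    """x_i(t): height sqrt(P*(2^k+1)/2) on [i/(2^k+1), (i+2)/(2^k+1)], zero elsewhere."""
    amp = np.sqrt(P * (M + 1) / 2)
    lo, hi = i / (M + 1), (i + 2) / (M + 1)
    return np.where((t >= lo) & (t <= hi), amp, 0.0)

# Riemann-sum approximation of ||x_i||^2 = \int_0^1 x_i(t)^2 dt for every i.
for i in range(M):
    print(f"i={i}: ||x_i||^2 ≈ {np.sum(pulse(i) ** 2) * dt:.4f}   (P = {P})")
```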

c. (6 points) Calculate $\langle x_i, x_j \rangle$. Are they all orthogonal to each other?

d. (20 points) Choose a (good) formula for a threshold $T$ and argue why the probability that the true $\tilde{Y}_m > T$ should approach 1 as $k$ increases, while the probability that all false $\tilde{Y}_f < T$ should also approach 1 as $k$ increases. (HINT: Now there are two kinds of false codewords: those that are not orthogonal to the true codeword and those that are. Deal with them separately.) (Second HINT: Feel free to guess at the threshold and move on to the next part. Come back to finish this later.)
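Continuing the same discretized setup (repeated here so the snippet stands alone, again with illustrative k and P), this sketch computes the Gram matrix of inner products $\langle x_i, x_j \rangle$ asked about in part (c) and then forms the correlator outputs $\tilde{Y}_i$ for one noisy received waveform, which is one way to build intuition for the two kinds of false codewords mentioned in the hint of part (d). The white-noise discretization below (independent $N(0, dt)$ increments per grid cell) is an assumption of the sketch.

```python
import numpy as np

k, P = 3, 4.0                   # illustrative values
M = 2 ** k
S = 50_000
t = np.linspace(0.0, 1.0, S, endpoint=False)
dt = 1.0 / S

def pulse(i):
    amp = np.sqrt(P * (M + 1) / 2)
    lo, hi = i / (M + 1), (i + 2) / (M + 1)
    return np.where((t >= lo) & (t <= hi), amp, 0.0)

X = np.stack([pulse(i) for i in range(M)])        # one row per pulse

# Part (c): Gram matrix <x_i, x_j> = \int x_i(t) x_j(t) dt; see which entries are nonzero.
gram = (X @ X.T) * dt
print(np.round(gram, 2))

# Part (d) intuition: transmit message m, add white noise of intensity 1, then correlate.
rng = np.random.default_rng(0)
m = 2
noise_incr = rng.normal(0.0, np.sqrt(dt), S)      # approximates N(t) dt on each grid cell
Y_dt = X[m] * dt + noise_incr                     # Y(t) dt = x_m(t) dt + N(t) dt
Y_tilde = (X / np.sqrt(P)) @ Y_dt                 # \tilde{Y}_i = \int Y(t) x_i(t)/sqrt(P) dt
print(np.round(Y_tilde, 2))                       # true index vs. overlapping vs. disjoint pulses
```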

[Extra Page]

e. (5 points) What is the energy per bit required in your scheme from part (c) for successful decoding? Comment.

Problem 3. Random Affine Codes with a twist (35 points)

In this problem, you are asked to consider a different approach to coming up with random affine codes. We are going to be working in a finite field $\mathbb{F}_n$ of size $2^n$. This means that there are addition, additive inverses, multiplication, multiplicative inverses, etc. For this problem, feel free to assume that all the properties you learned in 70 about finite fields of prime size continue to apply to this field $\mathbb{F}_n$, so you can also invoke polynomial properties, including properties of roots, degrees, interpolation, etc., as you would like.

Furthermore, assume there is an invertible mapping $g(\vec{x})$ that maps an $n$-bit binary string $\vec{x}$ into an element of this field, and $g^{-1}$ does the reverse. The messages $\vec{d}$ are $k$ bits long and can be interpreted as $n$-bit binary strings by simply putting zeros in front. So assume that you can apply $g$ to messages $\vec{d}$ as well. (HINT: It might help to just ignore $g$ throughout the problem and just believe that multiplication of these vectors is a magical operation that obeys finite-field rules.)

Consider a randomized coding strategy that works as follows:

1. Two elements of the finite field $\mathbb{F}_n$ are drawn iid uniformly at random. Call these $A$ and $B$. This is done at design time so that both the encoder and decoder know $A$ and $B$.

2. $n$-bit codewords are generated for $k$-bit messages $\vec{d}$ using $\vec{x}(\vec{d}) = g^{-1}(A \cdot g(\vec{d}) + B)$, where the $\cdot$ represents multiplication in $\mathbb{F}_n$ and the $+$ represents addition in $\mathbb{F}_n$.

a. (5 points) Show that the codeword corresponding to a message $\vec{d}$ is drawn uniformly at random from all $2^n$ possible codewords.
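Here is a small Python sketch of the encoding rule, using a toy field GF(2^3) with the irreducible polynomial x^3 + x + 1 (a choice made for the sketch) and taking g to be the identity map on 3-bit strings with field addition equal to bitwise XOR, which is exactly the extra assumption introduced later in part (d). It enumerates all (A, B) pairs and counts how often each codeword occurs for one fixed message, in the spirit of part (a).

```python
from collections import Counter

N = 3                 # toy field GF(2^3); the problem works in a field of size 2^n
IRRED = 0b1011        # x^3 + x + 1, irreducible over GF(2) (a choice made for this sketch)

def gf_mul(a, b):
    """Multiply two field elements: carry-less polynomial multiply, then reduce mod IRRED."""
    prod = 0
    for i in range(N):
        if (b >> i) & 1:
            prod ^= a << i
    for i in range(2 * N - 2, N - 1, -1):     # reduce terms of degree N..2N-2
        if (prod >> i) & 1:
            prod ^= IRRED << (i - N)
    return prod

def encode(d, A, B):
    """Codeword x(d) = g^{-1}(A * g(d) + B), with g the identity and '+' bitwise XOR."""
    return gf_mul(A, d) ^ B

d = 0b001             # one fixed (zero-padded) message
counts = Counter(encode(d, A, B) for A in range(2 ** N) for B in range(2 ** N))
print(counts)         # each of the 2^N codewords should appear equally often over the (A, B) draws
```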

b. (10 points) Show that the codewords corresponding to different messages are pairwise independent. (HINT: Feel free to skip this part and come back to it later. You won't need this part for the next parts.)
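In the same toy GF(2^3) setup (repeated so the snippet stands alone), this sketch enumerates all (A, B) pairs and counts the joint codeword values for two distinct messages; equal counts over all $2^n \times 2^n$ codeword pairs is what pairwise independence with uniform marginals looks like in this brute-force form. Again, the field size and messages are illustrative choices, not part of the problem.

```python
from collections import Counter

N, IRRED = 3, 0b1011           # same toy field GF(2^3) as in the previous sketch

def gf_mul(a, b):
    """Carry-less multiply of field elements, reduced mod the irreducible polynomial."""
    prod = 0
    for i in range(N):
        if (b >> i) & 1:
            prod ^= a << i
    for i in range(2 * N - 2, N - 1, -1):
        if (prod >> i) & 1:
            prod ^= IRRED << (i - N)
    return prod

d1, d2 = 0b001, 0b010          # two distinct messages
pairs = Counter((gf_mul(A, d1) ^ B, gf_mul(A, d2) ^ B)
                for A in range(2 ** N) for B in range(2 ** N))

# 2^N * 2^N equally likely (A, B) choices spread evenly over 2^N * 2^N codeword pairs.
print(len(pairs), set(pairs.values()))    # expect 64 distinct pairs, each hit exactly once
```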

c. (5 points) From parts (a) and (b), you can see that all the properties we required in the in-class random-coding analysis are satisfied, and these random codes will work as well as those discussed in lecture. How many bits does it take to represent a particular code? (i.e., to specify $A$ and $B$)

d. (5 points) Further suppose that the mapping $g$ has the following properties: $g(\vec{0}) = 0$ and $g(\vec{v} + \vec{w}) = g(\vec{v}) + g(\vec{w})$ for all $\vec{v}$ and $\vec{w}$. (i.e., it is as though the addition operation in $\mathbb{F}_n$ operates bitwise.) Argue why, in this case, for use over a binary symmetric channel, it suffices to just use linear codes $g^{-1}(A \cdot g(\vec{d}))$. (A binary symmetric channel is the channel discussed in class that looks like additive iid Bernoulli-$p$ noise in the binary field.)

e. (10 points) Keeping the assumptions of part (d) above, are the resulting codes linear codes when viewed over binary vectors? (i.e., for every $A$, does there exist a matrix $G$ so that $g^{-1}(A \cdot g(\vec{d})) = G\vec{d}$ for all $\vec{d}$?) (HINT: Can you interpret the columns of $G$? Then check that interpretation...)
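For the question in part (e), here is a sketch in the same toy GF(2^3) setting (with g the identity and field addition equal to bitwise XOR, as in part (d)). It builds a candidate matrix G whose j-th column is the codeword of the j-th unit vector, following the hint's suggestion about interpreting the columns, and then checks $g^{-1}(A \cdot g(\vec{d})) = G\vec{d}$ over GF(2) for every message. The exam asks for a general argument; this is only a numerical check for one particular A.

```python
import numpy as np

N, IRRED = 3, 0b1011           # same toy field GF(2^3) as above

def gf_mul(a, b):
    prod = 0
    for i in range(N):
        if (b >> i) & 1:
            prod ^= a << i
    for i in range(2 * N - 2, N - 1, -1):
        if (prod >> i) & 1:
            prod ^= IRRED << (i - N)
    return prod

def bits(v):
    """Field element (an int) -> length-N binary column vector, most significant bit first."""
    return np.array([(v >> (N - 1 - j)) & 1 for j in range(N)])

A = 0b110                      # an arbitrary field element chosen for this sketch

# Candidate G: column j is the image of the j-th unit vector under d -> A * d.
G = np.stack([bits(gf_mul(A, 1 << (N - 1 - j))) for j in range(N)], axis=1)

for d in range(2 ** N):
    assert np.array_equal(bits(gf_mul(A, d)), (G @ bits(d)) % 2)   # matrix product over GF(2)
print("A*d agrees with G·d (mod 2) for every d, for this particular A")
```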

[Extra Page]