ECS 332: Principles of Communications 2012/1. HW 4 Due: Sep 7


ECS 332: Principles of Communications 2012/1
HW 4 Due: Sep 7
Lecturer: Prapun Suksompong, Ph.D.

Instructions

(a) ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected, so you should work on all of them.
(b) It is important that you try to solve all problems. (5 pt)
(c) Late submission will be heavily penalized.
(d) Write down all the steps that you have taken to obtain your answers. You may not get full credit, even when your answer is correct, if you do not show how you obtained it.

Problem 1. The spectrum of a periodic square wave can be found using:

    % specsquare.m: plot the spectrum of a square wave
    close all
    f0=10;                    % (fundamental) frequency of the square wave
    EndTime=2;                % will consider from time = 0 to EndTime
    Ts=1/1000;                % sampling interval (time interval between samples)
    t=0:Ts:(EndTime-Ts);      % create a time vector
    x=sign(cos(2*pi*f0*t));   % square wave = sign of cos wave
    plotspec(x,Ts)            % call plotspec to draw spectrum

The output of specsquare.m is shown in Figure 4.1. The top plot shows the first 2 seconds of a square wave with fundamental frequency f0 = 10 cycles per second. The bottom plot shows a series of spikes that define the frequency content. In this case, the largest spikes occur at ±10 Hz, followed by smaller spikes at all the odd-integer multiples (i.e., at ±30, ±50, ±70, etc.). Modify specsquare.m to investigate the relationship between the time behavior of the square wave and its spectrum. Try square waves with different (fundamental) frequencies: f0 = 20, 40, 100, 300 Hz. (Keep Ts = 1/1000.) Describe the aliasing effect in each of these cases. How do the time plots change? How do the spectra change?
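As a rough cross-check of what plotspec displays, the same spectrum can be sketched in Python/NumPy. This is only an analogous sketch, not the course's MATLAB routine (plotspec comes with the course software); the FFT conventions here are assumptions of this sketch.

```python
import numpy as np

f0, Ts, end_time = 10, 1 / 1000, 2        # square-wave frequency, sample interval, duration
t = np.arange(0, end_time, Ts)            # time vector, matching t = 0:Ts:(EndTime-Ts)
x = np.sign(np.cos(2 * np.pi * f0 * t))   # square wave = sign of a cosine

X = np.fft.fftshift(np.fft.fft(x))        # spectrum, centered so 0 Hz is in the middle
f = np.fft.fftshift(np.fft.fftfreq(len(x), Ts))  # frequency axis in Hz

peak = f[np.argmax(np.abs(X))]            # location of the largest spike
print(abs(peak))                          # the fundamental: spikes at +/-10, +/-30, ...
```

Sweeping f0 over 20, 40, 100, 300 in this sketch reproduces the fold-back behavior the problem asks about.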

Figure 4.1: Plots from specsquare.m (time waveform in seconds on top; magnitude spectrum versus frequency in Hz below)

Problem 2. Determine the Nyquist sampling rate and the Nyquist sampling interval for the signals:

(a) sinc(100πt)
(b) sinc²(100πt)
(c) sinc(100πt) + sinc(50πt)
(d) sinc(100πt) + 3 sinc²(60πt)
(e) sinc(50πt) sinc(100πt)

Remark: Recall that in our class, sinc(x) = sin(x)/x.
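The rule behind these parts can be sketched numerically: with the class convention sinc(x) = sin(x)/x, the signal sinc(2πWt) is bandlimited to W Hz, so its Nyquist rate is 2W. The helper below is illustrative only (its name and interface are assumptions, not from the course).

```python
import math

def nyquist_rate(arg_coeff, squared=False):
    """Nyquist rate (Hz) of sinc(arg_coeff*t); squared=True for sinc^2(arg_coeff*t)."""
    W = arg_coeff / (2 * math.pi)   # one-sided bandwidth in Hz
    if squared:
        W *= 2    # squaring in time convolves the spectrum with itself: bandwidth doubles
    return 2 * W

rate_a = nyquist_rate(100 * math.pi)                 # (a): W = 50 Hz, rate = 100 Hz
rate_b = nyquist_rate(100 * math.pi, squared=True)   # (b): W = 100 Hz, rate = 200 Hz
# (c): a sum is limited by the wider component
rate_c = max(nyquist_rate(100 * math.pi), nyquist_rate(50 * math.pi))
# (e): a product in time convolves the spectra, so the bandwidths add (25 + 50 = 75 Hz)
rate_e = nyquist_rate(50 * math.pi) + nyquist_rate(100 * math.pi)
print(rate_a, rate_b, rate_c, rate_e, 1 / rate_a)    # last value: Nyquist interval of (a)
```

The Nyquist interval is simply the reciprocal of the Nyquist rate, e.g. 1/100 s = 10 ms for part (a).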

Problem 3. Consider a signal g(t) = sinc(πt).

(a) Sketch the Fourier transform G(f) of g(t).

(b) Find the Nyquist sampling rate.

(c) Recall that the instantaneous sampled signal g_δ(t) is defined by

    g_δ(t) = g(t) Σ_{n=−∞}^{∞} δ(t − nT_s),

where T_s is the sampling interval.

    (i) Let T_s = 0.5. Sketch the Fourier transform G_δ(f) of g_δ(t).
    (ii) Let T_s = 4/3. Sketch the Fourier transform G_δ(f) of g_δ(t).

(d) The sequence of sampled values g[n] is constructed from g(t) by g[n] = g(t)|_{t = nT_s}. Recall the reconstruction equation:

    g_r(t) = Σ_{n=−∞}^{∞} g[n] sinc(πf_s(t − nT_s)).

Note that we write g_r(t) instead of g(t) to accommodate the case that the sampling rate is too low, in which case the reconstructed signal is not the same as g(t).

    (i) With T_s = 1,
        i. Find g[n] for n = . . . , −4, −3, −2, −1, 0, 1, 2, 3, 4, . . .
        ii. Use the reconstruction equation to find g_r(t).
    (ii) Let's test the reconstruction equation by using MATLAB to plot g_r(t). Note that the sum in the reconstruction equation extends from −∞ to +∞. In MATLAB, we cannot add that many terms, so we need to stop at some finite n. In this part, use T_s = 0.5.
        i. Use MATLAB to plot g_r(t) when only the n = 0 term is included.
        ii. Use MATLAB to plot g_r(t) and all of its sinc components. Include only n = −1, 0, 1.
        iii. Use MATLAB to plot g_r(t) and all of its sinc components. Include only n = −5, −4, . . . , −1, 0, 1, . . . , 4, 5.
        iv. Use MATLAB to plot g_r(t) and all of its sinc components. Include only n = −10, −9, . . . , −1, 0, 1, . . . , 9, 10.
    In all these plots, consider t from −4 to 4. Also include the plot of sinc(πt) for comparison.
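The truncated reconstruction in part (d)(ii) can be checked numerically as well. The sketch below is in Python rather than the MATLAB the problem asks for; it samples g(t) = sinc(πt) at T_s = 0.5 (above the Nyquist rate) and sums the reconstruction equation over n = −N, . . . , N.

```python
import math

def sinc(x):
    """Class convention: sinc(x) = sin(x)/x, with sinc(0) = 1."""
    return 1.0 if x == 0 else math.sin(x) / x

def g(t):
    return sinc(math.pi * t)

Ts = 0.5          # sampling interval; f_s = 2 exceeds the Nyquist rate of 1 Hz
fs = 1 / Ts
N = 200           # truncation: keep only the terms n = -N..N of the infinite sum

def g_r(t):
    """Truncated reconstruction g_r(t) = sum_n g[n] sinc(pi*f_s*(t - n*T_s))."""
    return sum(g(n * Ts) * sinc(math.pi * fs * (t - n * Ts))
               for n in range(-N, N + 1))

for t in (0.0, 0.25, 1.3):
    print(t, g(t), g_r(t))   # with enough terms, g_r(t) tracks g(t) closely
```

With only a few terms (n = 0, or n = −1, 0, 1, as in the problem), the mismatch between g_r(t) and g(t) is clearly visible away from the sample points; it shrinks as more terms are included.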

ECS 332: Solution for Problem Set 4

Problem 1: Aliasing and the periodic square wave

In the time domain, the switching between the values −1 and 1 should be faster as we increase f0. In the frequency domain, we should get impulses (spikes) at all the odd-integer multiples of f0 Hz. The center spikes (at ±f0) should be the largest among them. All the plots below are adjusted so that they show 10 periods of the original signal in the time domain.

From the plots, as we increase f0 from 10 to 20 Hz, the locations of the spikes change from all the odd-integer multiples of 10 Hz to all the odd-integer multiples of 20 Hz. In particular, we see the spikes at 20, 60, 100, 140, 180, 220, 260, 300, 340, 380, 420, 460. Note that plotspec only plots over [−fs/2, fs/2), so we see a spike at −500 but not at 500. Of course, the Fourier transform of the sampled waveform is periodic, and hence when we replicate the spectrum every fs, we will have a spike at 500. Note that in reality we should also see spikes at 540, 580, 620, 660, and so on. However, because the sampling rate is 1000, these high-frequency spikes suffer from aliasing and fold back into our viewing window [−fs/2, fs/2). They fall back onto frequencies that already have spikes (for example, 540 folds back to 460, and 580 folds back to 420), so the aliasing effect is not easily noticeable in the frequency domain.

When f0 = 40, we start to see the aliasing effect in the frequency domain. Instead of seeing spikes only at 40, 120, 200, 280, 360, 440, the spikes at higher frequencies (such as 520, 600, and so on) fold back to lower frequencies (such as 480, 400, and so on). The plot still looks OK in the time domain.

At the higher fundamental frequency f0 = 100, we see a stronger aliasing effect. In the time domain, the waveform does not look quite rectangular. In the frequency domain, we only see the spikes at 100, 300, and 500. These are at the correct locations, but there are too few of them to reconstruct a square waveform. The rest of the spikes are beyond our viewing window; we cannot see them directly because they fold back onto frequencies that are already occupied by the lower-frequency spikes.

This problem can be mitigated by reducing the sampling interval to Ts = 1/1e4 instead of Ts = 1/1e3. The spikes then show up again, as shown by the plot on the right above.

Finally, at the highest frequency f0 = 300, if we still use Ts = 1/1e3, the waveform is heavily distorted in the time domain, as shown in the left plot below. We have large spikes at ±300 as expected. However, the next pair, which should occur at ±900, is outside the viewing window and therefore folds back to ±100. Again, the aliasing effect can be mitigated by reducing the sampling interval to Ts = 1/1e4. More spikes then show up at their expected places. Note that we can still see many small spikes scattered across the frequency domain; these are again spikes from higher frequencies that fold back into our viewing window.
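The fold-back arithmetic quoted above can be sketched in a few lines (the helper name alias is hypothetical; signs are kept, so a spike that folds onto the ±460 pair appears here as −460):

```python
def alias(f, fs):
    """Apparent frequency of a spectral spike at f Hz when sampled at fs Hz:
    fold f into the viewing window [-fs/2, fs/2)."""
    f_mod = f % fs                          # the sampled spectrum repeats every fs
    return f_mod - fs if f_mod >= fs / 2 else f_mod

fs = 1000
print(alias(540, fs))   # the 540 Hz spike folds onto the +/-460 pair
print(alias(580, fs))   # onto +/-420
print(alias(900, fs))   # at f0 = 300, the +/-900 harmonic lands on +/-100
```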

Q2: Nyquist sampling rate and Nyquist sampling interval (ECS332 HW4 Sol, Pages 1-2)

Q3: Sinc Reconstruction of Sinc (ECS332 HW4 Sol, Pages 3-5)

ECS 332: Principles of Communications 2012/1
HW 5 Due: Sep 26
Lecturer: Prapun Suksompong, Ph.D.

Instructions

(a) ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected, so you should work on all of them.
(b) It is important that you try to solve all problems. (5 pt)
(c) Late submission will be heavily penalized.
(d) Write down all the steps that you have taken to obtain your answers. You may not get full credit, even when your answer is correct, if you do not show how you obtained it.

Problem 1. Consider a signal g(t) = sinc(3(t − 5)).

(a) Is g(t) time-limited?
(b) Is g(t) band-limited?
(c) Carefully sketch g(t).
(d) Carefully sketch the magnitude |G(f)| of the Fourier transform G(f).

Problem 2. State the reconstruction formula. Hint: You should be able to do this by recalling the reconstruction process.

Problem 3. State Nyquist's (first) criterion for zero ISI

(a) in the time domain;
(b) in the frequency domain.

Problem 4. In each part below, a pulse P(f) is defined in the frequency domain from f = 0 to f = 1. Outside of [0, 1], your task is to assign value(s) to P(f) so that it becomes a Nyquist pulse. Of course, you will also need to specify the symbol interval T. Hint: To avoid dealing with complex-valued P(f), you may assume that p(t) is real-valued and even, in which case P(f) is also real-valued and even.

(a) Find a Nyquist pulse P(f) with P(f) = 0.5 on [0, 1].

(b) Find a Nyquist pulse P(f) with P(f) = 0.25 on [0, 1].

(c) Find a Nyquist pulse P(f) with

    P(f) = { 0.5, 0 ≤ f < 0.5
           { ···, 0.5 ≤ f ≤ 1

(d) Find a Nyquist pulse P(f) with

    P(f) = { 0.5,  f ∈ [0, 0.25) ∪ [0.5, 0.75)
           { 0.25, f ∈ [0.25, 0.5) ∪ [0.75, 1]

Problem 5. Consider a raised cosine pulse p(t) and its Fourier transform P(f). Assume the rolloff factor α = 0.3 and the symbol duration T = 1.

(a) Carefully sketch P(f).
(b) Find p(2).
(c) Find P(0.5).
(d) Find P(0.3).
(e) *Find P(0.4).

Remark: You should be able to solve this problem without referring to the ugly formula.

Problem 6. Consider a raised cosine pulse p(t) with rolloff factor α and symbol duration T.

(a) Find p(T/2) as a function of α.
(b) Use MATLAB to plot p(T/2) as a function of α.
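The point values in Problem 5 can be checked numerically. The sketch below uses the standard raised-cosine formulas (the "ugly formula" the remark alludes to) in Python rather than MATLAB; it is a generic implementation, not the course's solution.

```python
import math

def rc_pulse(t, alpha, T):
    """Raised-cosine pulse p(t) with rolloff alpha and symbol duration T."""
    if t == 0:
        return 1.0
    denom = 1 - (2 * alpha * t / T) ** 2
    if abs(denom) < 1e-12:                       # removable singularity at t = T/(2*alpha)
        return (math.pi / 4) * math.sin(math.pi * t / T) / (math.pi * t / T)
    return (math.sin(math.pi * t / T) / (math.pi * t / T)) \
        * math.cos(math.pi * alpha * t / T) / denom

def rc_spectrum(f, alpha, T):
    """Raised-cosine spectrum P(f): flat out to (1-alpha)/(2T), cosine rolloff after."""
    f = abs(f)
    f1, f2 = (1 - alpha) / (2 * T), (1 + alpha) / (2 * T)
    if f <= f1:
        return T
    if f <= f2:
        return (T / 2) * (1 + math.cos(math.pi * T / alpha * (f - f1)))
    return 0.0

alpha, T = 0.3, 1.0
print(rc_pulse(2, alpha, T))        # essentially 0: zero ISI at nonzero multiples of T
print(rc_spectrum(0.3, alpha, T))   # 1.0: inside the flat region |f| <= 0.35
print(rc_spectrum(0.5, alpha, T))   # 0.5: P(1/(2T)) = T/2 for any rolloff
print(rc_spectrum(0.4, alpha, T))   # 0.5*(1 + cos(pi/6)), about 0.933
```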

Q1: Sinc Review (ECS332 HW5 Sol, Page 1)
Q2: Reconstruction Formula (Page 2)
Q3: Nyquist's Criterion (Page 3)
Q4: Nyquist Pulses (Pages 4-6)
Q5: Raised Cosine Pulse (Page 7)
Q6: Raised Cosine Pulse (Page 8)

ECS 332: Principles of Communications 2012/1
HW 6 Due: Oct 5
Lecturer: Prapun Suksompong, Ph.D.

Instructions

(a) ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected, so you should work on all of them.
(b) It is important that you try to solve all problems. (5 pt)
(c) Late submission will be heavily penalized.
(d) Write down all the steps that you have taken to obtain your answers. You may not get full credit, even when your answer is correct, if you do not show how you obtained it.

Problem 1. Consider the code {0, 01}.

(a) Is it nonsingular?
(b) Is it uniquely decodable?
(c) Is it prefix-free?

Problem 2. Consider the random variable X whose support S_X contains seven values: S_X = {x_1, x_2, . . . , x_7}. Their corresponding probabilities are given by

    x        x_1  x_2  x_3  x_4  x_5  x_6  x_7
    p_X(x)   ···  ···  ···  ···  ···  ···  ···

(a) Find the entropy H(X).
(b) Find a binary Huffman code for X.
(c) Find the expected codelength for the encoding in part (b).

Problem 3. Find the entropy and the binary Huffman code for the random variable X with pmf p_X(x) = x/··· for x = 1, 2, . . . Also calculate E[l(X)] when the Huffman code is used.
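The code-class questions in Problem 1 can be checked mechanically: prefix-freeness by direct comparison, and unique decodability by the Sardinas-Patterson test. The sketch below is a generic implementation of that test, not part of the course material.

```python
def is_prefix_free(code):
    """True if no codeword is a (proper) prefix of another."""
    return not any(a != b and b.startswith(a) for a in code for b in code)

def is_uniquely_decodable(code):
    """Sardinas-Patterson test: the code is uniquely decodable iff no set of
    dangling suffixes ever contains a codeword."""
    code = set(code)
    # initial dangling suffixes: w such that c1 + w = c2 for codewords c1 != c2
    S = {b[len(a):] for a in code for b in code if a != b and b.startswith(a)}
    seen = set()
    while S and not (S & code):
        seen |= S
        new = set()
        for s in S:
            for c in code:
                if c.startswith(s) and c != s:   # codeword = suffix + new suffix
                    new.add(c[len(s):])
                if s.startswith(c) and s != c:   # suffix = codeword + new suffix
                    new.add(s[len(c):])
        S = new - seen
    return not (S & code)

print(is_prefix_free(['0', '01']))          # False: 0 is a prefix of 01
print(is_uniquely_decodable(['0', '01']))   # True: the test terminates cleanly
```

For {0, 01} the only dangling suffix is "1", which generates nothing further, so the code is uniquely decodable even though it is not prefix-free.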

Problem 4. Construct a random variable X (and its pmf) whose Huffman code is {0, 10, 11}.

Problem 5. These codes cannot be Huffman codes. Why?

(a) {00, 01, 10, 110}
(b) {01, 10}

Hint: A Huffman code is optimal.

Problem 6. A memoryless source emits two possible messages, Y(es) and N(o), with probabilities 0.9 and 0.1, respectively.

(a) Determine the entropy (per source symbol) of this source.
(b) Find the expected codeword length per symbol of the binary Huffman code for the third-order extension of this source.
(c) Use MATLAB to find the expected codeword length per symbol of the binary Huffman code for the fourth-order extension of this source.
(d) Use MATLAB to plot the expected codeword length per symbol of the binary Huffman code for the nth-order extension of this source for n = 1, 2, . . .
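For Problem 6(b), the per-symbol length of the third-order extension can be checked numerically: the source entropy is about 0.469 bits/symbol, and Huffman coding of longer blocks drives the per-symbol length toward it. The sketch below is in Python rather than the MATLAB the problem asks for, with a generic Huffman routine.

```python
import heapq
from math import log2
from itertools import product

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given pmf."""
    # heap entries: (probability, tiebreak counter, indices of merged symbols)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    count = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1          # each merge adds one bit to every merged symbol
        heapq.heappush(heap, (p1 + p2, count, s1 + s2))
        count += 1
    return lengths

# third-order extension of the Y/N source, P(Y) = 0.9, P(N) = 0.1
p = {'Y': 0.9, 'N': 0.1}
blocks = [''.join(b) for b in product('YN', repeat=3)]
probs = [p[b[0]] * p[b[1]] * p[b[2]] for b in blocks]
lengths = huffman_lengths(probs)
L3 = sum(q * l for q, l in zip(probs, lengths)) / 3   # expected length per source symbol
H = -(0.9 * log2(0.9) + 0.1 * log2(0.1))              # entropy per source symbol
print(round(L3, 4), round(H, 4))   # per-symbol length exceeds, but approaches, the entropy
```

Raising the block length in `repeat=` gives parts (c) and (d) directly.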

Q1: Classes of Codes (ECS332 HW6 Sol, Page 1)
Q2: Huffman Code (Page 2)
Q3: Huffman Code (Page 3)
Q4: Inverse Huffman Problem (Pages 4-5)
Q5: Non-Huffman Codes (Page 6)
Q6: Source Extension (Pages 7-9)

ECS 332: Principles of Communications 2012/1
HW Solution 7 Due: N/A
Lecturer: Prapun Suksompong, Ph.D.

Problem 1. Optimal code lengths that require one bit above entropy: The source coding theorem says that the Huffman code for a random variable X has an expected length strictly less than H(X) + 1. Give an example of a random variable for which the expected length of the Huffman code is very close to H(X) + 1.

Problem 2. Consider the AWGN channel

    Y = X + N,

where N ∼ N(0, σ_N²). Assume that X and N are independent and that X takes two values, a and −a. For the whole question, assume a > 0.

(a) In this part, use a = 5 and σ_N = 3.

The channel output Y is fed into a decision device (a comparator) which compares the value of Y to 0. The output X̂ of the decision (thresholding) device is determined by

    X̂ = { a,  if Y ≥ 0,
        { −a, if Y < 0.

The whole system is shown in Figure 7.1.

Figure 7.1: System for Q1.a

Find

    (i) P[X̂ = a | X = a]
    (ii) P[X̂ = −a | X = a]

    (iii) P[X̂ = a | X = −a]
    (iv) P[X̂ = −a | X = −a]
    (v) P[X̂ ≠ X]

    Hint: By the total probability theorem,

    P[X̂ ≠ X] = P[X̂ ≠ X | X = a] P[X = a] + P[X̂ ≠ X | X = −a] P[X = −a]
              = P[X̂ = −a | X = a] P[X = a] + P[X̂ = a | X = −a] P[X = −a].

(b) Continuing from part (a), we can use the system from part (a) to transmit/receive binary information by adding a simple mapping device that maps 0 to a and 1 to −a at the transmitter. At the receiver, we also have another mapping device that maps a and −a back to 0 and 1, respectively. The new system is shown in Figure 7.2.

Figure 7.2: System for Q1.b

Note that S and Z are binary and that the whole system (inside the dotted box) in Figure 7.2 can be reduced to a binary symmetric channel (BSC) with crossover probability p. Find p.

(c) Observe that as σ_N increases, the value of p in part (b) also increases. Is it possible to find σ_N such that p > 0.5?

(d) Express p using a, σ_N, and the Q function.

Problem 3. Consider a transmission over the BSC with crossover probability p. The random input to the BSC is denoted by S. Assume S ∼ Bernoulli(p_1). Let Z be the output of the BSC.

(a) Suppose, at the receiver (which observes the output of the BSC), we learn that Z = 1. For each of the following scenarios, which event is more likely: S = 1 was transmitted or S = 0 was transmitted? (Hint: Use Bayes' theorem.)

    (i) Assume p = 0.3 and p_1 = 0.1
    (ii) Assume p = 0.3 and p_1 = 0.5
    (iii) Assume p = 0.3 and p_1 = 0.9
    (iv) Assume p = 0.7 and p_1 = 0.5

(b) Suppose, at the receiver (which observes the output of the BSC), we learn that Z = 0. For each of the following scenarios, which event is more likely: S = 1 was transmitted or S = 0 was transmitted?

    (i) Assume p = 0.3 and p_1 = 0.1
    (ii) Assume p = 0.3 and p_1 = 0.5
    (iii) Assume p = 0.3 and p_1 = 0.9
    (iv) Assume p = 0.7 and p_1 = 0.5

Remark: A MAP (maximum a posteriori) detector is a detector that takes the observed value Z and then calculates the most likely transmitted value. More specifically,

    Ŝ_MAP(z) = arg max_s P[S = s | Z = z].

In fact, in part (a), each of your answers is Ŝ_MAP(1), and in part (b), each of your answers is Ŝ_MAP(0).

Problem 4. Consider a repetition code with a code rate of 1/5. Assume that the code is used over a BSC with crossover probability p = 0.4.

(a) Assume that the receiver uses majority vote to decode the transmitted bit. Find the probability of error.

(b) Assume that the source produces source bit S with P[S = 0] = 1 − P[S = 1] = 0.4. Suppose the receiver observes 01001.

    (i) What is the probability that 0 was transmitted? (Do not forget that this is a conditional probability. The answer is not 0.4, because we have some extra information from the observed bits at the receiver.)

    (ii) What is the probability that 1 was transmitted?
    (iii) Given the observed 01001, which event is more likely: S = 1 was transmitted or S = 0 was transmitted? Does your answer agree with the majority voting rule for decoding?

(c) Assume that the source produces source bit S with P[S = 0] = 1 − P[S = 1] = p_0. Suppose the receiver observes 01001.

    (i) What is the probability that 0 was transmitted?
    (ii) What is the probability that 1 was transmitted?
    (iii) Given the observed 01001, which event is more likely: S = 1 was transmitted or S = 0 was transmitted? Your answer may depend on the value of p_0. Does your answer agree with the majority voting rule for decoding?

Problem 5. A channel encoder maps blocks of two bits to five-bit (channel) codewords. The four possible codewords are 00000, 01000, 10001, and ···. A codeword is transmitted over the BSC with crossover probability p = 0.1. Suppose the receiver observes ··· at the output of the BSC.

(a) Assume that all four codewords are equally likely to be transmitted. Given the observation at the receiver, what is the most likely codeword that was transmitted?

(b) The Hamming distance between two binary vectors is defined as the number of positions at which the corresponding bits are different. For example, the Hamming distance between ··· and ··· is 2. The minimum-distance decoder is defined as the decoder that compares the Hamming distances between the observed bits at the receiver and each of the possible codewords. The output of this decoder is the codeword that gives the minimum distance. Explain why the minimum-distance decoder would give the same decoded codeword as the decoder in part (a).

(c) What is the minimum (Hamming) distance d_min among the codewords?

(d) Assume that the four codewords are not equally likely. Suppose ··· is transmitted more frequently, with probability 0.7. The other three codewords are transmitted with probability 0.1 each.
Given the observation at the receiver, what is the most likely codeword that was transmitted?
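Several numerical parts of Problems 2-5 can be sanity-checked with a short script. The sketch below is in Python, separate from the handwritten solution pages. For the Problem 5 portion, the fourth codeword (11111) and the received word (01001, borrowed from Problem 4) are assumptions for illustration, since they are not spelled out above.

```python
from math import erfc, sqrt, comb

def Q(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * erfc(x / sqrt(2))

# Problem 2(d): by symmetry of the Gaussian noise, both error events have the same
# probability, so the equivalent BSC has crossover probability p = Q(a / sigma_N).
p_bsc = Q(5 / 3)                 # with a = 5 and sigma_N = 3 from part (a)

# Problem 3: MAP detection from the BSC output (Bayes' theorem; P[Z=z] cancels)
def map_detect(z, p, p1):
    post1 = (p if z == 0 else 1 - p) * p1        # P[Z=z | S=1] P[S=1]
    post0 = (p if z == 1 else 1 - p) * (1 - p1)  # P[Z=z | S=0] P[S=0]
    return 1 if post1 > post0 else 0

# Problem 4(a): majority vote fails when 3 or more of the 5 repeated bits flip
p = 0.4
P_maj_err = sum(comb(5, k) * p**k * (1 - p)**(5 - k) for k in range(3, 6))

# Problem 5(b): minimum-distance decoding (codebook partly hypothetical, see above)
codewords = ['00000', '01000', '10001', '11111']
received = '01001'
hamming = lambda u, v: sum(a != b for a, b in zip(u, v))
decoded = min(codewords, key=lambda c: hamming(c, received))

print(round(p_bsc, 4))            # crossover probability of the equivalent BSC
print(map_detect(1, 0.3, 0.1))    # Problem 3(a)(i): the strong prior on S = 0 wins
print(P_maj_err)                  # Problem 4(a): majority-vote error probability
print(decoded)                    # codeword nearest to the received word
```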

Q1 (ECS332 HW7 Sol, Pages 1-2)
Q2 (Pages 3-5)
Q3 (Pages 6-7)
Q4 (Pages 8-9)
Q5 (Pages 10-12)


More information

NORWEGIAN UNIVERSITY OF SCIENCE AND TECHNOLOGY DEPARTMENT OF ELECTRONICS AND TELECOMMUNICATIONS

NORWEGIAN UNIVERSITY OF SCIENCE AND TECHNOLOGY DEPARTMENT OF ELECTRONICS AND TELECOMMUNICATIONS page 1 of 5 (+ appendix) NORWEGIAN UNIVERSITY OF SCIENCE AND TECHNOLOGY DEPARTMENT OF ELECTRONICS AND TELECOMMUNICATIONS Contact during examination: Name: Magne H. Johnsen Tel.: 73 59 26 78/930 25 534

More information

E2.5 Signals & Linear Systems. Tutorial Sheet 1 Introduction to Signals & Systems (Lectures 1 & 2)

E2.5 Signals & Linear Systems. Tutorial Sheet 1 Introduction to Signals & Systems (Lectures 1 & 2) E.5 Signals & Linear Systems Tutorial Sheet 1 Introduction to Signals & Systems (Lectures 1 & ) 1. Sketch each of the following continuous-time signals, specify if the signal is periodic/non-periodic,

More information

a) Find the compact (i.e. smallest) basis set required to ensure sufficient statistics.

a) Find the compact (i.e. smallest) basis set required to ensure sufficient statistics. Digital Modulation and Coding Tutorial-1 1. Consider the signal set shown below in Fig.1 a) Find the compact (i.e. smallest) basis set required to ensure sufficient statistics. b) What is the minimum Euclidean

More information

Lecture 4: Proof of Shannon s theorem and an explicit code

Lecture 4: Proof of Shannon s theorem and an explicit code CSE 533: Error-Correcting Codes (Autumn 006 Lecture 4: Proof of Shannon s theorem and an explicit code October 11, 006 Lecturer: Venkatesan Guruswami Scribe: Atri Rudra 1 Overview Last lecture we stated

More information

Coding of memoryless sources 1/35

Coding of memoryless sources 1/35 Coding of memoryless sources 1/35 Outline 1. Morse coding ; 2. Definitions : encoding, encoding efficiency ; 3. fixed length codes, encoding integers ; 4. prefix condition ; 5. Kraft and Mac Millan theorems

More information

HW Solution 12 Due: Dec 2, 9:19 AM

HW Solution 12 Due: Dec 2, 9:19 AM ECS 315: Probability and Random Processes 2015/1 HW Solution 12 Due: Dec 2, 9:19 AM Lecturer: Prapun Suksompong, Ph.D. Problem 1. Let X E(3). (a) For each of the following function g(x). Indicate whether

More information

HW Solution 2 Due: July 10:39AM

HW Solution 2 Due: July 10:39AM ECS 35: Probability and Random Processes 200/ HW Solution 2 Due: July 9 @ 0:39AM Lecturer: Prapun Suksompong, Ph.D. Instructions (a) A part of ONE question will be graded. Of course, you do not know which

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY Department of Electrical Engineering and Computer Science Transmission of Information Spring 2006

MASSACHUSETTS INSTITUTE OF TECHNOLOGY Department of Electrical Engineering and Computer Science Transmission of Information Spring 2006 MASSACHUSETTS INSTITUTE OF TECHNOLOGY Department of Electrical Engineering and Computer Science 6.44 Transmission of Information Spring 2006 Homework 2 Solution name username April 4, 2006 Reading: Chapter

More information

Lecture 7 September 24

Lecture 7 September 24 EECS 11: Coding for Digital Communication and Beyond Fall 013 Lecture 7 September 4 Lecturer: Anant Sahai Scribe: Ankush Gupta 7.1 Overview This lecture introduces affine and linear codes. Orthogonal signalling

More information

CS6304 / Analog and Digital Communication UNIT IV - SOURCE AND ERROR CONTROL CODING PART A 1. What is the use of error control coding? The main use of error control coding is to reduce the overall probability

More information

Weiyao Lin. Shanghai Jiao Tong University. Chapter 5: Digital Transmission through Baseband slchannels Textbook: Ch

Weiyao Lin. Shanghai Jiao Tong University. Chapter 5: Digital Transmission through Baseband slchannels Textbook: Ch Principles of Communications Weiyao Lin Shanghai Jiao Tong University Chapter 5: Digital Transmission through Baseband slchannels Textbook: Ch 10.1-10.5 2009/2010 Meixia Tao @ SJTU 1 Topics to be Covered

More information

Line Codes and Pulse Shaping Review. Intersymbol interference (ISI) Pulse shaping to reduce ISI Embracing ISI

Line Codes and Pulse Shaping Review. Intersymbol interference (ISI) Pulse shaping to reduce ISI Embracing ISI Line Codes and Pulse Shaping Review Line codes Pulse width and polarity Power spectral density Intersymbol interference (ISI) Pulse shaping to reduce ISI Embracing ISI Line Code Examples (review) on-off

More information

BINARY CODES. Binary Codes. Computer Mathematics I. Jiraporn Pooksook Department of Electrical and Computer Engineering Naresuan University

BINARY CODES. Binary Codes. Computer Mathematics I. Jiraporn Pooksook Department of Electrical and Computer Engineering Naresuan University Binary Codes Computer Mathematics I Jiraporn Pooksook Department of Electrical and Computer Engineering Naresuan University BINARY CODES: BCD Binary Coded Decimal system is represented by a group of 4

More information

16.36 Communication Systems Engineering

16.36 Communication Systems Engineering MIT OpenCourseWare http://ocw.mit.edu 16.36 Communication Systems Engineering Spring 2009 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms. 16.36: Communication

More information

Linear Block Codes. Saravanan Vijayakumaran Department of Electrical Engineering Indian Institute of Technology Bombay

Linear Block Codes. Saravanan Vijayakumaran Department of Electrical Engineering Indian Institute of Technology Bombay 1 / 26 Linear Block Codes Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay July 28, 2014 Binary Block Codes 3 / 26 Let F 2 be the set

More information

Bandwidth: Communicate large complex & highly detailed 3D models through lowbandwidth connection (e.g. VRML over the Internet)

Bandwidth: Communicate large complex & highly detailed 3D models through lowbandwidth connection (e.g. VRML over the Internet) Compression Motivation Bandwidth: Communicate large complex & highly detailed 3D models through lowbandwidth connection (e.g. VRML over the Internet) Storage: Store large & complex 3D models (e.g. 3D scanner

More information

Coding for Discrete Source

Coding for Discrete Source EGR 544 Communication Theory 3. Coding for Discrete Sources Z. Aliyazicioglu Electrical and Computer Engineering Department Cal Poly Pomona Coding for Discrete Source Coding Represent source data effectively

More information

Exercise 1. = P(y a 1)P(a 1 )

Exercise 1. = P(y a 1)P(a 1 ) Chapter 7 Channel Capacity Exercise 1 A source produces independent, equally probable symbols from an alphabet {a 1, a 2 } at a rate of one symbol every 3 seconds. These symbols are transmitted over a

More information

Quiz 2 Date: Monday, November 21, 2016

Quiz 2 Date: Monday, November 21, 2016 10-704 Information Processing and Learning Fall 2016 Quiz 2 Date: Monday, November 21, 2016 Name: Andrew ID: Department: Guidelines: 1. PLEASE DO NOT TURN THIS PAGE UNTIL INSTRUCTED. 2. Write your name,

More information

EC2252 COMMUNICATION THEORY UNIT 5 INFORMATION THEORY

EC2252 COMMUNICATION THEORY UNIT 5 INFORMATION THEORY EC2252 COMMUNICATION THEORY UNIT 5 INFORMATION THEORY Discrete Messages and Information Content, Concept of Amount of Information, Average information, Entropy, Information rate, Source coding to increase

More information

Example: Bipolar NRZ (non-return-to-zero) signaling

Example: Bipolar NRZ (non-return-to-zero) signaling Baseand Data Transmission Data are sent without using a carrier signal Example: Bipolar NRZ (non-return-to-zero signaling is represented y is represented y T A -A T : it duration is represented y BT. Passand

More information

COMM901 Source Coding and Compression. Quiz 1

COMM901 Source Coding and Compression. Quiz 1 German University in Cairo - GUC Faculty of Information Engineering & Technology - IET Department of Communication Engineering Winter Semester 2013/2014 Students Name: Students ID: COMM901 Source Coding

More information

Capacity of a channel Shannon s second theorem. Information Theory 1/33

Capacity of a channel Shannon s second theorem. Information Theory 1/33 Capacity of a channel Shannon s second theorem Information Theory 1/33 Outline 1. Memoryless channels, examples ; 2. Capacity ; 3. Symmetric channels ; 4. Channel Coding ; 5. Shannon s second theorem,

More information

Linear Codes and Syndrome Decoding

Linear Codes and Syndrome Decoding Linear Codes and Syndrome Decoding These notes are intended to be used as supplementary reading to Sections 6.7 9 of Grimaldi s Discrete and Combinatorial Mathematics. The proofs of the theorems are left

More information

channel of communication noise Each codeword has length 2, and all digits are either 0 or 1. Such codes are called Binary Codes.

channel of communication noise Each codeword has length 2, and all digits are either 0 or 1. Such codes are called Binary Codes. 5 Binary Codes You have already seen how check digits for bar codes (in Unit 3) and ISBN numbers (Unit 4) are used to detect errors. Here you will look at codes relevant for data transmission, for example,

More information

Chapter 2: Source coding

Chapter 2: Source coding Chapter 2: meghdadi@ensil.unilim.fr University of Limoges Chapter 2: Entropy of Markov Source Chapter 2: Entropy of Markov Source Markov model for information sources Given the present, the future is independent

More information

Digital Band-pass Modulation PROF. MICHAEL TSAI 2011/11/10

Digital Band-pass Modulation PROF. MICHAEL TSAI 2011/11/10 Digital Band-pass Modulation PROF. MICHAEL TSAI 211/11/1 Band-pass Signal Representation a t g t General form: 2πf c t + φ t g t = a t cos 2πf c t + φ t Envelope Phase Envelope is always non-negative,

More information

SIGNAL COMPRESSION Lecture Shannon-Fano-Elias Codes and Arithmetic Coding

SIGNAL COMPRESSION Lecture Shannon-Fano-Elias Codes and Arithmetic Coding SIGNAL COMPRESSION Lecture 3 4.9.2007 Shannon-Fano-Elias Codes and Arithmetic Coding 1 Shannon-Fano-Elias Coding We discuss how to encode the symbols {a 1, a 2,..., a m }, knowing their probabilities,

More information

Lecture 5b: Line Codes

Lecture 5b: Line Codes Lecture 5b: Line Codes Dr. Mohammed Hawa Electrical Engineering Department University of Jordan EE421: Communications I Digitization Sampling (discrete analog signal). Quantization (quantized discrete

More information

Shannon's Theory of Communication

Shannon's Theory of Communication Shannon's Theory of Communication An operational introduction 5 September 2014, Introduction to Information Systems Giovanni Sileno g.sileno@uva.nl Leibniz Center for Law University of Amsterdam Fundamental

More information

1.2 Inductive Reasoning

1.2 Inductive Reasoning 1.2 Inductive Reasoning Goal Use inductive reasoning to make conjectures. Key Words conjecture inductive reasoning counterexample Scientists and mathematicians look for patterns and try to draw conclusions

More information

Electrical and Information Technology. Information Theory. Problems and Solutions. Contents. Problems... 1 Solutions...7

Electrical and Information Technology. Information Theory. Problems and Solutions. Contents. Problems... 1 Solutions...7 Electrical and Information Technology Information Theory Problems and Solutions Contents Problems.......... Solutions...........7 Problems 3. In Problem?? the binomial coefficent was estimated with Stirling

More information

COMPSCI 650 Applied Information Theory Apr 5, Lecture 18. Instructor: Arya Mazumdar Scribe: Hamed Zamani, Hadi Zolfaghari, Fatemeh Rezaei

COMPSCI 650 Applied Information Theory Apr 5, Lecture 18. Instructor: Arya Mazumdar Scribe: Hamed Zamani, Hadi Zolfaghari, Fatemeh Rezaei COMPSCI 650 Applied Information Theory Apr 5, 2016 Lecture 18 Instructor: Arya Mazumdar Scribe: Hamed Zamani, Hadi Zolfaghari, Fatemeh Rezaei 1 Correcting Errors in Linear Codes Suppose someone is to send

More information

Signal Design for Band-Limited Channels

Signal Design for Band-Limited Channels Wireless Information Transmission System Lab. Signal Design for Band-Limited Channels Institute of Communications Engineering National Sun Yat-sen University Introduction We consider the problem of signal

More information

Basic information theory

Basic information theory Basic information theory Communication system performance is limited by Available signal power Background noise Bandwidth limits. Can we postulate an ideal system based on physical principles, against

More information

4. Quantization and Data Compression. ECE 302 Spring 2012 Purdue University, School of ECE Prof. Ilya Pollak

4. Quantization and Data Compression. ECE 302 Spring 2012 Purdue University, School of ECE Prof. Ilya Pollak 4. Quantization and Data Compression ECE 32 Spring 22 Purdue University, School of ECE Prof. What is data compression? Reducing the file size without compromising the quality of the data stored in the

More information

Summary: ISI. No ISI condition in time. II Nyquist theorem. Ideal low pass filter. Raised cosine filters. TX filters

Summary: ISI. No ISI condition in time. II Nyquist theorem. Ideal low pass filter. Raised cosine filters. TX filters UORIAL ON DIGIAL MODULAIONS Part 7: Intersymbol interference [last modified: 200--23] Roberto Garello, Politecnico di orino Free download at: www.tlc.polito.it/garello (personal use only) Part 7: Intersymbol

More information

Coding for Digital Communication and Beyond Fall 2013 Anant Sahai MT 1

Coding for Digital Communication and Beyond Fall 2013 Anant Sahai MT 1 EECS 121 Coding for Digital Communication and Beyond Fall 2013 Anant Sahai MT 1 PRINT your student ID: PRINT AND SIGN your name:, (last) (first) (signature) PRINT your Unix account login: ee121- Prob.

More information

Lecture 1. Introduction

Lecture 1. Introduction Lecture 1. Introduction What is the course about? Logistics Questionnaire Dr. Yao Xie, ECE587, Information Theory, Duke University What is information? Dr. Yao Xie, ECE587, Information Theory, Duke University

More information

Physical Layer and Coding

Physical Layer and Coding Physical Layer and Coding Muriel Médard Professor EECS Overview A variety of physical media: copper, free space, optical fiber Unified way of addressing signals at the input and the output of these media:

More information

A Family of Nyquist Filters Based on Generalized Raised-Cosine Spectra

A Family of Nyquist Filters Based on Generalized Raised-Cosine Spectra Proc. Biennial Symp. Commun. (Kingston, Ont.), pp. 3-35, June 99 A Family of Nyquist Filters Based on Generalized Raised-Cosine Spectra Nader Sheikholeslami Peter Kabal Department of Electrical Engineering

More information

CSCI 2570 Introduction to Nanocomputing

CSCI 2570 Introduction to Nanocomputing CSCI 2570 Introduction to Nanocomputing Information Theory John E Savage What is Information Theory Introduced by Claude Shannon. See Wikipedia Two foci: a) data compression and b) reliable communication

More information

Problem Value

Problem Value GEORGIA INSTITUTE OF TECHNOLOGY SCHOOL of ELECTRICAL & COMPUTER ENGINEERING FINAL EXAM DATE: 30-Apr-04 COURSE: ECE-2025 NAME: GT #: LAST, FIRST Recitation Section: Circle the date & time when your Recitation

More information

DEPARTMENT OF EECS MASSACHUSETTS INSTITUTE OF TECHNOLOGY. 6.02: Digital Communication Systems, Fall Quiz I. October 11, 2012

DEPARTMENT OF EECS MASSACHUSETTS INSTITUTE OF TECHNOLOGY. 6.02: Digital Communication Systems, Fall Quiz I. October 11, 2012 6.02 Fall 2012, Quiz 2 Page 1 of 12 Name: DEPARTMENT OF EECS MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.02: Digital Communication Systems, Fall 2012 Quiz I October 11, 2012 your section Section Time Recitation

More information

Principles of Communications

Principles of Communications Principles of Communications Weiyao Lin, PhD Shanghai Jiao Tong University Chapter 4: Analog-to-Digital Conversion Textbook: 7.1 7.4 2010/2011 Meixia Tao @ SJTU 1 Outline Analog signal Sampling Quantization

More information

exercise in the previous class (1)

exercise in the previous class (1) exercise in the previous class () Consider an odd parity check code C whose codewords are (x,, x k, p) with p = x + +x k +. Is C a linear code? No. x =, x 2 =x =...=x k = p =, and... is a codeword x 2

More information

Channel Coding I. Exercises SS 2017

Channel Coding I. Exercises SS 2017 Channel Coding I Exercises SS 2017 Lecturer: Dirk Wübben Tutor: Shayan Hassanpour NW1, Room N 2420, Tel.: 0421/218-62387 E-mail: {wuebben, hassanpour}@ant.uni-bremen.de Universität Bremen, FB1 Institut

More information

Chapter 7. Error Control Coding. 7.1 Historical background. Mikael Olofsson 2005

Chapter 7. Error Control Coding. 7.1 Historical background. Mikael Olofsson 2005 Chapter 7 Error Control Coding Mikael Olofsson 2005 We have seen in Chapters 4 through 6 how digital modulation can be used to control error probabilities. This gives us a digital channel that in each

More information

Machine Learning, Fall 2009: Midterm

Machine Learning, Fall 2009: Midterm 10-601 Machine Learning, Fall 009: Midterm Monday, November nd hours 1. Personal info: Name: Andrew account: E-mail address:. You are permitted two pages of notes and a calculator. Please turn off all

More information

HW Solution 7 Due: Oct 24

HW Solution 7 Due: Oct 24 ECS 315: Probability and Random Processes 2014/1 HW Solution 7 Due: Oct 24 Lecturer: Prapun Suksompong, Ph.D. Instructions (a) ONE part of a question will be graded (5 pt). Of course, you do not know which

More information

BASICS OF DETECTION AND ESTIMATION THEORY

BASICS OF DETECTION AND ESTIMATION THEORY BASICS OF DETECTION AND ESTIMATION THEORY 83050E/158 In this chapter we discuss how the transmitted symbols are detected optimally from a noisy received signal (observation). Based on these results, optimal

More information

SIPCom8-1: Information Theory and Coding Linear Binary Codes Ingmar Land

SIPCom8-1: Information Theory and Coding Linear Binary Codes Ingmar Land SIPCom8-1: Information Theory and Coding Linear Binary Codes Ingmar Land Ingmar Land, SIPCom8-1: Information Theory and Coding (2005 Spring) p.1 Overview Basic Concepts of Channel Coding Block Codes I:

More information

ECE 564/645 - Digital Communications, Spring 2018 Midterm Exam #1 March 22nd, 7:00-9:00pm Marston 220

ECE 564/645 - Digital Communications, Spring 2018 Midterm Exam #1 March 22nd, 7:00-9:00pm Marston 220 ECE 564/645 - Digital Communications, Spring 08 Midterm Exam # March nd, 7:00-9:00pm Marston 0 Overview The exam consists of four problems for 0 points (ECE 564) or 5 points (ECE 645). The points for each

More information

Digital Communication Systems ECS 452

Digital Communication Systems ECS 452 Digital Communication Systems ECS 452 Asst. Prof. Dr. Prapun Suksompong prapun@siit.tu.ac.th 3 Discrete Memoryless Channel (DMC) Office Hours: BKD, 6th floor of Sirindhralai building Tuesday 4:20-5:20

More information

Lecture 22: Final Review

Lecture 22: Final Review Lecture 22: Final Review Nuts and bolts Fundamental questions and limits Tools Practical algorithms Future topics Dr Yao Xie, ECE587, Information Theory, Duke University Basics Dr Yao Xie, ECE587, Information

More information

This examination consists of 11 pages. Please check that you have a complete copy. Time: 2.5 hrs INSTRUCTIONS

This examination consists of 11 pages. Please check that you have a complete copy. Time: 2.5 hrs INSTRUCTIONS THE UNIVERSITY OF BRITISH COLUMBIA Department of Electrical and Computer Engineering EECE 564 Detection and Estimation of Signals in Noise Final Examination 6 December 2006 This examination consists of

More information