ECS 452: Digital Communication Systems 2015/2. HW 1 Due: Feb 5


ECS 452: Digital Communication Systems 2015/2
HW 1 Due: Feb 5
Lecturer: Asst. Prof. Dr. Prapun Suksompong

Instructions
(a) You must solve all non-optional problems. (5 pt)
(i) Write your first name and the last three digits of your student ID on the upper-right corner of every submitted page.
(ii) For each part, write your explanation/derivation and answer in the space provided.
(b) Only ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected, so you should work on all of them.
(c) Late submissions will be rejected.
(d) Write down all the steps that you took to obtain your answers. Even when your answer is correct, you may not get full credit if you do not show how you obtained it.

Problem 1 (HW1-2015-2). Consider the code {0, 01}.
(a) Is it nonsingular?
(b) Is it uniquely decodable?
(c) Is it prefix-free?
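For questions like this, a quick programmatic check can complement the reasoning. The following is a minimal MATLAB sketch (illustrative only, not part of the assignment; the variable names are made up) that tests whether any codeword is a prefix of another:

% Sketch: check whether any codeword in a code is a prefix of another one.
code = {'0', '01'};
prefix_free = true;
for i = 1:numel(code)
    for j = 1:numel(code)
        if i ~= j && strncmp(code{j}, code{i}, length(code{i}))
            prefix_free = false;    % code{i} is a prefix of code{j}
        end
    end
end
prefix_free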

Problem 2 (HW1-2015-2). Consider the random variable X whose support S_X contains seven values: S_X = {x_1, x_2, ..., x_7}. Their corresponding probabilities are given by

x        x_1    x_2    x_3    x_4    x_5    x_6    x_7
p_X(x)   0.49   0.26   0.12   0.04   0.04   0.03   0.02

(a) Find the entropy H(X).
(b) Find a binary Huffman code for X.
(c) Find the expected codeword length for the encoding in part (b).
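A minimal MATLAB sketch for checking parts (a)-(c) numerically, assuming the Communications Toolbox function huffmandict is available. (Huffman codes are not unique, so a hand-constructed code may differ from MATLAB's while having the same expected length.)

% Sketch (illustrative, not required): numerical check for Problem 2.
p = [0.49 0.26 0.12 0.04 0.04 0.03 0.02];    % pmf of X
H = -sum(p.*log2(p))                         % (a) entropy H(X) in bits
[dict, avglen] = huffmandict(1:7, p);        % (b) one possible binary Huffman code
codewords = dict(:,2)                        % codewords as vectors of bits
avglen                                       % (c) expected codeword length E[l(X)]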

Problem 3 (HW1-2015-2). Find the entropy and the binary Huffman code for the random variable X with pmf

p_X(x) = { x/21,  x = 1, 2, ..., 6,
         { 0,     otherwise.

Also calculate E[l(X)] when the Huffman code is used.

Problem 4 (HW1-2015-2). These codes cannot be Huffman codes. Why?
(a) {00, 01, 10, 110}
(b) {01, 10}
(c) {0, 01}

Problem 5 (HW1-2015-2). A memoryless source emits two possible messages Y(es) and N(o) with probabilities 0.9 and 0.1, respectively.
(a) Determine the entropy (per source symbol) of this source.
(b) Find the expected codeword length per symbol of the Huffman binary code for the third-order extensions of this source.
(c) Use MATLAB to find the expected codeword length per (source) symbol of the Huffman binary code for the fourth-order extensions of this source.
(i) Put your answer here.
(ii) Don't forget to attach the printout of your MATLAB script (highlighting the modified parts if you start from the provided class example) and the expressions/results displayed in the command window.
(d) Use MATLAB to plot the expected codeword length per (source) symbol of the Huffman binary code for the nth-order extensions of this source for n = 1, 2, ..., 8. Attach the printout of your plot.
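The sketch below outlines one way parts (b)-(d) could be attacked in MATLAB. It is not the provided class example, and it assumes the Communications Toolbox function huffmandict is available; the pmf of the nth-order extension is built by repeated Kronecker products of the single-symbol pmf.

% Sketch: expected Huffman codeword length per source symbol for the
% nth-order extension of a source with P(Y) = 0.9 and P(N) = 0.1.
p1 = [0.9 0.1];
L  = zeros(1,8);
for n = 1:8
    pn = 1;
    for k = 1:n
        pn = kron(pn, p1);                   % pmf of the nth-order extension
    end
    [~, avglen] = huffmandict(1:length(pn), pn);
    L(n) = avglen/n;                         % expected length per source symbol
end
L(3)                                         % part (b): third-order extension
L(4)                                         % part (c): fourth-order extension
plot(1:8, L, 'o-'); grid on;                 % part (d)
xlabel('order of extension n'); ylabel('E[codeword length] per source symbol');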

Problem 6 (HW1-2015-2). (Optional) The following claim is sometimes found in the literature: It can be shown that the length l(x) of the Huffman code of a symbol x with probability p_X(x) is always less than or equal to ⌈log_2(1/p_X(x))⌉. Even though it is correct in many cases, this claim is not true in general. Find an example where the length l(x) of the Huffman code of a symbol x is greater than ⌈log_2(1/p_X(x))⌉.
Hint: Consider a pmf that has the following four probability values {0.01, 0.30, 0.34, 0.35}.

Problem 7 (HW1-2015-2). (Optional) Construct a random variable X (by specifying its pmf) whose corresponding Huffman code is {0, 10, 11}.
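For Problem 6, a candidate pmf can be checked numerically with a sketch like the following (again assuming huffmandict is available; the pmf used here is just the one from the hint):

% Sketch: compare Huffman codeword lengths with ceil(log2(1/p)).
p = [0.01 0.30 0.34 0.35];
dict  = huffmandict(1:4, p);
len   = cellfun(@length, dict(:,2)).'        % Huffman codeword lengths l(x)
bound = ceil(log2(1./p))                     % the claimed upper bound
find(len > bound)                            % symbols violating the claim, if any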

ECS 452: Digital Communication Systems 2015/2
HW 2 Due: Feb 19
Lecturer: Asst. Prof. Dr. Prapun Suksompong

Instructions
(a) You must solve all non-optional problems. (5 pt)
(i) Write your first name and the last three digits of your student ID on the upper-right corner of every submitted page.
(ii) For each part, write your explanation/derivation and answer in the space provided.
(b) Only ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected, so you should work on all of them.
(c) Late submissions will be rejected.
(d) Write down all the steps that you took to obtain your answers. Even when your answer is correct, you may not get full credit if you do not show how you obtained it.

Problem 1. In this question, each output string from a DMS is encoded by the following source code:

x   Codeword c(x)
a   1
d   01
e   0000
i   001
o   00010
u   00011

(a) Is the code prefix-free?
(b) Is the code uniquely decodable?

(c) Suppose the DMS produces the string audio. Find the output of the source encoder.
(d) Suppose the output of the source encoder is
0 0 0 1 0 0 1 0 0 0 0 0 0 1 0 1 0 0 0 0 1 0 1 0 0 1 0 0 0 0
Find the corresponding source string produced by the DMS. Use / to indicate the locations where the string above is split into codewords.

Problem 2. A DMC has X = {0, 1} and Y = {1, 2, 3}. The following decoding table is used to decode the channel output.

y   x̂(y)
1   0
2   1
3   0

Suppose the channel output string is 212213221133122122132.
(a) Find the corresponding decoded string.
(b) Suppose the channel input string is produced from an ASCII source encoder by the command dec2bin(sourcestring,7) in MATLAB. Assume that there is no channel decoding error. Find the corresponding source string. (One way to invert dec2bin in MATLAB is sketched after Problem 3.)

Problem 3 (HW2-2015-2). Consider a BSC whose crossover probability for each bit is p = 0.35. Suppose P[X = 0] = 0.45.
(a) Draw the channel diagram.

(b) Find the channel matrix Q.
(c) Find the joint pmf matrix P.
(d) Find the row vector q which contains the pmf of the channel output Y.
(e) We now analyze the performance of all four reasonable detectors for this binary channel. Complete the table below:

x̂(y)                 y     1 - y     1     0
P[X̂ = 0 | X = 0]
P[X̂ = 1 | X = 1]
P(C)
P(E)

(f) Find the MAP detector and its error probability.
(g) Find the ML detector and its error probability.
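A minimal MATLAB sketch (not part of the assignment; the variable names are illustrative) showing how the entries of the table in part (e) could be checked numerically for this BSC:

% Sketch: evaluating the four deterministic detectors for a BSC with
% crossover probability p = 0.35 and prior P[X = 0] = 0.45 (Problem 3).
p  = 0.35;  p0 = 0.45;
Q  = [1-p, p; p, 1-p];        % channel matrix; rows x = 0,1, columns y = 0,1
px = [p0, 1-p0];              % input pmf
P  = diag(px)*Q;              % joint pmf matrix, P(x+1,y+1) = P[X=x, Y=y]
q  = px*Q                     % output pmf of Y
detectors = [0 1;             % x_hat(y) = y
             1 0;             % x_hat(y) = 1 - y
             1 1;             % x_hat(y) = 1
             0 0];            % x_hat(y) = 0
for k = 1:4
    PC = 0;
    for y = 0:1
        xhat = detectors(k, y+1);
        PC = PC + P(xhat+1, y+1);     % add P[X = x_hat(y), Y = y]
    end
    fprintf('detector %d: P(C) = %.4f, P(E) = %.4f\n', k, PC, 1-PC);
end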
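As mentioned in Problem 2(b), the following sketch illustrates one way to invert dec2bin(sourcestring,7) in MATLAB, assuming the 7-bit patterns (one per character) are received one after another. The bit stream below is a hypothetical example, not the one from the problem.

% Sketch: recover an ASCII string from a stream of 7-bit patterns produced
% by dec2bin(sourcestring,7).
decoded_bits = '10000011000010';                 % hypothetical stream ('A' then 'B')
bit_blocks   = reshape(decoded_bits, 7, []).';   % one 7-bit pattern per row
sourcestring = char(bin2dec(bit_blocks)).'       % back to ASCII characters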

Problem 4 (HW2-2015-2). Consider a DMC whose X = {1, 2, 3}, Y = {1, 2, 3}, and

Q = [0.5 0.2 0.3
     0.3 0.4 0.3
     0.2 0.2 0.6].

Suppose the input probability vector is p = [0.2, 0.4, 0.4].
(a) Find the joint pmf matrix P.
(b) Find the row vector q which contains the pmf of the channel output Y.
(c) Find the error probability of the naive decoder.
(d) Find the error probability of the (DIY) decoder x̂(y) = 4 - y.

(e) Find the MAP detector and its error probability.
(f) Find the ML detector and its error probability.

Problem 5 (HW2-2015-2). Consider a BAC whose Q(1|0) = 0.35 and Q(0|1) = 0.55. Suppose P[X = 0] = 0.4.
(a) Draw the channel diagram.
(b) Find the joint pmf matrix P.

(c) Find the row vector q which contains the pmf of the channel output Y.
(d) We now analyze the performance of all four reasonable detectors for this binary channel. Complete the table below:

x̂(y)                 y     1 - y     1     0
P[X̂ = 0 | X = 0]
P[X̂ = 1 | X = 1]
P(C)
P(E)

(e) Find the MAP detector and its error probability.
(f) Find the ML detector and its error probability.

Problem 6 (HW2-2015-2). (Optional) Consider a DMC whose samples of input X and output Y are recorded as row vectors x and y in the file HW_DMC_Channel_Data.mat. Write a MATLAB script which uses the recorded information to estimate the quantities below. Note that most of these can be solved by appropriate parts of the m-files posted on the course web site.
(a) The support S_X of X.
(b) The support S_Y of Y.
(c) The row vector p which contains the pmf of X.
(d) The Q matrix.
(e) The row vector q which contains the pmf of Y. Do this using two methods:
(i) Count directly from the observed values of Y.
(ii) Use the estimated values of p and Q.

(f) The error probability when the naive decoder is used. Do this using two methods:
(i) Directly construct x̂ from y. Then, compare x̂ and x.
(ii) Use the estimated values of p and Q.
(g) The error probability when the MAP decoder is used. Do this using two methods:
(i) First find the MAP decoder table using the estimated values of p and Q. Then, construct x̂ from y according to the decoder table. Finally, compare x̂ and x.
(ii) Use the estimated values of p and Q to directly calculate the error probability.
(h) The error probability when the ML decoder is used. Do this using two methods:
(i) First find the ML decoder table using the estimated value of Q. Then, construct x̂ from y according to the decoder table. Finally, compare x̂ and x.
(ii) Use the estimated values of p and Q to directly calculate the error probability.

HW 2 Solution, Problem 6: MATLAB simulation of the system in Problem 4

% MATLAB Script for Q6 of HW2 for ECS 452
% By Asst. Prof. Dr. Prapun Suksompong
close all; clear all;
tic
load HW_DMC_Channel_Data

% (a) ----------------------------------------------------------------
S_X = unique(x)

% (b) ----------------------------------------------------------------
S_Y = unique(y)

%% Statistical Analysis
n = length(x);

% (c) ----------------------------------------------------------------
% The probability values for the channel inputs
p_x_sim = hist(x,S_X)/n       % Relative frequencies from the simulation

% (d) ----------------------------------------------------------------
% The channel transition probabilities from the simulation
Q_sim = [];
for k = 1:length(S_X)
    I = find(x==S_X(k));
    LI = length(I);
    rel_freq_xk = LI/n;
    yc = y(I);
    cond_rel_freq = hist(yc,S_Y)/LI;
    Q_sim = [Q_sim; cond_rel_freq];
end
Q_sim                         % Relative frequencies from the simulation

% (e) ----------------------------------------------------------------
% (e.i) --------------------------------------------------------------
p_y_sim = hist(y,S_Y)/n       % Relative frequencies from the simulation

% (e.ii) -------------------------------------------------------------
p_y_sim2 = p_x_sim*Q_sim

%%
p_x = p_x_sim; Q = Q_sim;

% (f) ----------------------------------------------------------------
%% Naive Decoder
x_hat = y;

% (f.i) --------------------------------------------------------------
% Error Probability
PE_sim_Naive = 1-sum(x==x_hat)/n   % Error probability from the simulation

% (f.ii) -------------------------------------------------------------
% Calculation of the theoretical error probability
PC = 0;
for k = 1:length(S_X)
    t = S_X(k);
    I = find(S_Y == t);
    if length(I) == 1
        PC = PC + p_x(k)*Q(k,I);
    end
end
PE_theoretical_Naive = 1-PC

% (g) ----------------------------------------------------------------
%% MAP Decoder
P = diag(p_x)*Q;   % Weight the channel transition probability by the
                   % corresponding prior probability.
[V I] = max(P);    % For I, the default MATLAB behavior is that when there
                   % are multiple max, the index of the first one is returned.
Decoder_Table_MAP = S_X(I)   % The decoded values corresponding to the received Y

% (g.i) --------------------------------------------------------------
% Decode according to the decoder table
x_hat = y;   % preallocation
for k = 1:length(S_Y)
    I = (y==S_Y(k));
    x_hat(I) = Decoder_Table_MAP(k);
end
PE_sim_MAP = 1-sum(x==x_hat)/n     % Error probability from the simulation

% (g.ii) -------------------------------------------------------------
% Calculation of the theoretical error probability
Decoder_Table = Decoder_Table_MAP;
PC = 0;
for k = 1:length(S_X)
    I = (Decoder_Table == S_X(k));
    Q_row = Q(k,:);
    PC = PC + p_x(k)*sum(Q_row(I));
end
PE_theoretical_MAP = 1-PC

% (h) ----------------------------------------------------------------
%% ML Decoder
[V I] = max(Q);    % For I, the default MATLAB behavior is that when there
                   % are multiple max, the index of the first one is returned.
Decoder_Table_ML = S_X(I)    % The decoded values corresponding to the received Y

% Decode according to the decoder table
x_hat = y;   % preallocation
for k = 1:length(S_Y)
    I = (y==S_Y(k));
    x_hat(I) = Decoder_Table_ML(k);
end

% (h.i) --------------------------------------------------------------
PE_sim_ML = 1-sum(x==x_hat)/n      % Error probability from the simulation

% (h.ii) -------------------------------------------------------------
% Calculation of the theoretical error probability
Decoder_Table = Decoder_Table_ML;
PC = 0;
for k = 1:length(S_X)
    I = (Decoder_Table == S_X(k));
    Q_row = Q(k,:);
    PC = PC + p_x(k)*sum(Q_row(I));
end
PE_theoretical_ML = 1-PC

toc

Results in the command window:

>> HW_DMC_Channel_Estimation_2
S_X =
     1     2     3
S_Y =
     1     2     3
p_x_sim =
    0.2004    0.4002    0.3994
Q_sim =
    0.4988    0.2012    0.3000
    0.3002    0.4000    0.2998
    0.2000    0.2003    0.5997
p_y_sim =
    0.3000    0.2804    0.4196
p_y_sim2 =
    0.3000    0.2804    0.4196
PE_sim_Naive =
    0.5004
PE_theoretical_Naive =
    0.5004
Decoder_Table_MAP =
     2     2     3
PE_sim_MAP =
    0.4802
PE_theoretical_MAP =
    0.4802

Decoder_Table_ML =
     1     2     3
PE_sim_ML =
    0.5004
PE_theoretical_ML =
    0.5004
Elapsed time is 0.410464 seconds.

ECS 452: Digital Communication Systems 2015/2
HW 3 Due: Feb 26
Lecturer: Asst. Prof. Dr. Prapun Suksompong

Instructions
(a) Solve all non-optional problems. (5 pt)
(i) Write your first name and the last three digits of your student ID on the upper-right corner of every submitted page.
(ii) For each part, write your explanation/derivation and answer in the space provided.
(b) Only ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected, so you should work on all of them.
(c) Late submissions will be rejected.
(d) Write down all the steps that you took to obtain your answers. Even when your answer is correct, you may not get full credit if you do not show how you obtained it.

Problem 1 (HW3-2015-2). Consider a repetition code with a code rate of 1/5. Assume that the code is used with a BSC with crossover probability p = 0.4.
(a) Find the ML detector and its error probability.

(b) Suppose the info-bit S is generated with P[S = 0] = 1 - P[S = 1] = 0.4. Find the MAP detector and its error probability.
(c) Assume the info-bit S is generated with P[S = 0] = 1 - P[S = 1] = 0.45. Suppose the receiver observes 01001. (A numerical sketch for computations of this kind appears after Problem 2.)
(i) What is the probability that 0 was transmitted? (Do not forget that this is a conditional probability. The answer is not 0.45 because we have some extra information from the observed bits at the receiver.)

(ii) What is the probability that 1 was transmitted?
(iii) Given the observed 01001, which event is more likely, S = 1 was transmitted or S = 0 was transmitted? Does your answer agree with the majority voting rule for decoding?
(d) Assume that the source produces source bit S with P[S = 0] = 1 - P[S = 1] = p_0. Suppose the receiver observes 01001.
(i) What is the probability that 0 was transmitted?
(ii) What is the probability that 1 was transmitted?

(iii) Given the observed 01001, which event is more likely, S = 1 was transmitted or S = 0 was transmitted? Your answer may depend on the value of p_0. Does your answer agree with the majority voting rule for decoding?

Problem 2 (HW3-2015-2). A channel encoder maps blocks of two bits to five-bit (channel) codewords. The four possible codewords are 00000, 01000, 10001, and 11111. A codeword is transmitted over the BSC with crossover probability p = 0.1.
(a) What is the minimum (Hamming) distance d_min among the codewords?
(b) Suppose the codeword x = 10001 was transmitted. What is the probability that the receiver observes y = 01001 at the output of the BSC?

(c) Suppose the receiver observes 01001 at the output of the BSC.
(i) Assume that all four codewords are equally likely to be transmitted. Given the observed 01001 at the receiver, what is the most likely codeword that was transmitted?
(ii) Assume that the four codewords are not equally likely. Suppose 11111 is transmitted more frequently with probability 0.7. The other three codewords are transmitted with probability 0.1 each. Given the observed 01001 at the receiver, what is the most likely codeword that was transmitted?
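Computations like those in Problem 1(c) and Problem 2(c) can be organized around the BSC likelihood P[Y = y | X = x] = p^d (1-p)^(n-d), where d is the Hamming distance between x and y. The MATLAB sketch below (illustrative only, not part of the assignment) evaluates this for the received word 01001 and the four codewords of Problem 2; the same idea applies to the two codewords 00000 and 11111 of the repetition code in Problem 1.

% Sketch: Hamming distances, likelihoods, and ML/MAP choices for the
% received word 01001 over a BSC with crossover probability p = 0.1.
p = 0.1;
y = [0 1 0 0 1];
C = [0 0 0 0 0;            % the four codewords, one per row
     0 1 0 0 0;
     1 0 0 0 1;
     1 1 1 1 1];
d = sum(C ~= repmat(y, 4, 1), 2).'         % Hamming distances d(x, y)
likelihood = p.^d .* (1-p).^(5-d)          % P[Y = y | X = each codeword]
[~, iML]  = max(likelihood)                % ML choice, part (c)(i)
prior = [0.1 0.1 0.1 0.7];                 % priors for part (c)(ii)
[~, iMAP] = max(prior .* likelihood)       % MAP choice, part (c)(ii)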

Problem 3 (HW3-2015-2, Optional). Optimal code lengths that require one bit above entropy: The source coding theorem says that the Huffman code for a random variable X has an expected length strictly less than H(X) + 1. Give an example of a random variable for which the expected length of the Huffman code (without any source extension) is very close to H(X) + 1.
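If you want to experiment numerically before committing to an example, a sketch like the following (assuming huffmandict is available; the candidate pmf is only a placeholder to modify) compares H(X) with the expected Huffman codeword length:

% Sketch: compare H(X) with the expected Huffman codeword length for a
% candidate pmf.
p = [0.6 0.4];                               % placeholder pmf; try other values
H = -sum(p.*log2(p))
[~, avglen] = huffmandict(1:length(p), p)
gap = avglen - H                             % how close is E[l(X)] to H(X) + 1?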

ECS 452: Digital Communication Systems 2015/2
HW 4 Due: Not Due
Lecturer: Asst. Prof. Dr. Prapun Suksompong

Problem 1 (HW4-2015-2, Free). In each row of the table below, compare the entropy H(X) of the random variable X in the first column with the entropy H(Y) of the random variable Y in the third column by writing >, =, or < in the second column. Watch out for approximation and round-off error.

H(X) when p = [0.3, 0.7].                                        H(Y) when q = [0.8, 0.2].
H(X) when p = [0.3, 0.3, 0.4].                                   H(Y) when q = [0.4, 0.3, 0.3].
H(X) when p_X(x) = 0.3 for x ∈ {1, 2}, 0.2 for x ∈ {3, 4},       H(Y) when q = [0.4, 0.3, 0.3].
         and 0 otherwise.

Problem 2 (HW4-2015-2, Free). Consider random variables X and Y whose joint pmf is given by

p_X,Y(x, y) = { c (x + y),  x ∈ {1, 3} and y ∈ {2, 4},
              { 0,          otherwise.

Evaluate the following quantities.
(a) c
(b) H(X, Y)
(c) H(X)
(d) H(Y)
(e) H(X | Y)
(f) H(Y | X)
(g) I(X; Y)
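A minimal MATLAB sketch (not required for this free homework) showing how the quantities in Problem 2 can be evaluated numerically once the joint pmf matrix is written down; it relies on the identities H(X|Y) = H(X,Y) - H(Y) and I(X;Y) = H(X) + H(Y) - H(X,Y).

% Sketch: information quantities from the joint pmf matrix of Problem 2.
% Rows correspond to x in {1,3}; columns to y in {2,4}.
W = [3 5; 5 7];                          % the values x + y on the support
c = 1/sum(W(:));                         % (a) the pmf must sum to one
P = c*W;                                 % joint pmf matrix p_X,Y(x,y)
pX = sum(P,2).'; pY = sum(P,1);          % marginal pmfs of X and Y
h  = @(v) -sum(v(v>0).*log2(v(v>0)));    % entropy of a pmf vector
HXY = h(P(:).'), HX = h(pX), HY = h(pY)  % (b), (c), (d)
HX_given_Y = HXY - HY                    % (e) H(X|Y)
HY_given_X = HXY - HX                    % (f) H(Y|X)
IXY = HX + HY - HXY                      % (g) I(X;Y)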

Problem 3 (HW4-2015-2, Free). Consider a pair of random variables X and Y whose joint pmf is given by

p_X,Y(x, y) = { 1/15,  x = 3, y = 1,
              { 2/15,  x = 4, y = 1,
              { 4/15,  x = 3, y = 3,
              { β,     x = 4, y = 3,
              { 0,     otherwise.

(a) Find the value of the constant β.
(b) Are X and Y independent?
(c) Evaluate the following quantities.
(i) H(X)
(ii) H(Y)
(iii) H(X, Y)
(iv) H(X | Y)
(v) H(Y | X)
(vi) I(X; Y)