
ELT 41307 COMMUNICATION THEORY
Matlab Exercise #2: Random variables and random processes

1 RANDOM VARIABLES

1.1 ROLLING A FAIR 6-FACED DICE (DISCRETE VARIABLE)

- Generate random samples for rolling a fair 6-faced dice N times (i.e. there is an equal probability for each value of the dice).
- First, roll the dice N=10 times (help randi)

N_faces = 6;   %Number of faces in the dice
N_trials = 10; %Number of trials (how many times the dice is rolled)
trials = randi(N_faces,1,N_trials); % Getting random integers 1...6

- Plot the histogram of the outcome values (help histogram, help hist, help bar, help histcounts)
- Note: the histogram function was introduced in MATLAB R2014b

x_histogram_centers = 1:6; %x-axis for the bin center coordinates

% Before version R2014b -------------------------------------------
hist_count = hist(trials,x_histogram_centers);
bar(x_histogram_centers,hist_count)
xlabel('rolled number')
ylabel('occurrence number')
grid on
% Histograms can be plotted directly as "hist(trials,x_histogram_centers)",
% but the plotting parameters are more difficult to handle.

% Version R2014b or after ------------------------------------------
% x-axis for the bin edge coordinates (last edge is for the right edge)
histogram_edges = 0.5:1:6.5;
% Below, the ~ sign means that we discard that specific output argument
[hist_count, ~] = histcounts(trials,histogram_edges);
bar(x_histogram_centers,hist_count)
xlabel('rolled number')
ylabel('occurrence number')

grid on % Set a grid in the plotted figure
title('dice rolling histogram')
% Histograms can be plotted directly as
% "h = histogram(trials,'BinLimits',[0.5 6.5],'BinMethod','integers')",
% where h is the histogram object.

- Normalize the histogram values with N_trials (the number of trials) so that we get the experimental probability density function (pdf) for the dice. Hence, after the normalization the sum of the histogram bin areas ("integral") is equal to one, as required for pdfs (actually, when talking about a discrete pdf, we often refer to the pmf, probability mass function).

pdf_experimental = hist_count/N_trials;

- Plot the normalized histogram, then define the true (analytic) pdf for the fair dice and plot it on top of the experimental pdf (help hold)

bar(x_histogram_centers,pdf_experimental)
xlabel('rolled number')
ylabel('pdf')
grid on
title('dice rolling normalized histogram')
one_face_probability = 1/N_faces; % probability of one face of the dice
hold on % avoids removing the previous plot
% plotting the true pdf (a line between two points)
plot([0.5 6.5],[one_face_probability one_face_probability],'r')
hold off
% Using legend we can name the different data curves in the plot (in order of
% appearance)
legend('experimental pdf','true pdf')

- Are the true pdf and the experimental pdf perfectly equal?
- Repeat the process a few times and compare the outcomes: Is the experimental pdf varying between simulations?
- Increase the number of trials to N=100000 and repeat the process a few times. Is the experimental pdf varying between simulations now?

1.2 NORMAL/GAUSSIAN DISTRIBUTED RANDOM VARIABLE

- Generate 1000 samples of a normally distributed random variable X ~ N(μ,σ²), where the mean and variance are given as μ=3 and σ²=4 (help randn)

N_samples = 1000;
mu = 3;
sigma = sqrt(4); % the variance is 4 (i.e. sigma^2=4)

- Calculate the following statistics of the observed samples (a minimal sketch of these steps is given right after this list):
  - Mean (or average, or expected value) (help mean)
  - Standard deviation (help std)
  - Variance (help var)
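One possible sketch of the sample generation and the three statistics calls, assuming the parameters N_samples, mu and sigma defined above (the variable X is reused by the plotting code below; the names sample_mean, sample_std and sample_var are illustrative only):

X = mu + sigma*randn(1,N_samples); % scale unit-variance samples by sigma and shift by mu
sample_mean = mean(X) % should be close to mu = 3
sample_std = std(X)   % should be close to sigma = 2
sample_var = var(X)   % should be close to sigma^2 = 4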

- Plot the experimental pdf (i.e. the normalized histogram, such that the sum of the histogram bin areas is equal to one)
- Define the histogram bin center x coordinates as bin_centers = -7:bin_width:13, where bin_width = 0.5 defines the width of one bin

bin_width = 0.5; %bin width (in the x-axis)
bin_centers = -7:bin_width:13; %x-axis for the bin center coordinates
% x-axis for the bin edge coordinates (last edge is for the right edge):
% (Three dots continue the command on the following line)
bin_edges = (bin_centers(1)-bin_width/2):bin_width: ...
    (bin_centers(end)+bin_width/2);
% ~ means that we discard that output argument
[hist_count, ~] = histcounts(X,bin_edges);
pdf_experimental = hist_count/sum(hist_count*bin_width);
bar(bin_centers,pdf_experimental,1)
% and so on with the titles, xlabels, ...

- Plot the true (analytic) pdf of X ~ N(μ,σ²) on top of the experimental pdf

pdf_true = 1/(sqrt(2*pi)*sigma)*exp(-(mu-bin_edges).^2/(2*sigma^2));
hold on
plot(bin_edges,pdf_true,'r','linewidth',3) % defines a specific line width
% and so on with the titles, xlabels, ... (remember the legend also)

- Repeat the experiment with a different number of samples: N=100 and N=100000
- See how the fit between the experimental pdf and the analytic pdf changes as the number of samples changes
- Based on the histogram with N=100000 samples, determine the probability P(X>5.25)
- Integrate (i.e. sum) the histogram bins whose x-axis center values are larger than 5.25
- Note that 5.25 is actually exactly on an edge between two histogram bins

b = 5.25;
indices_with_bin_center_larger_than_b = bin_centers > b;
considered_bin_values = pdf_experimental(indices_with_bin_center_larger_than_b);
%area of the considered bins:
probability_x_larger_than_b = sum(considered_bin_values*bin_width)

- Check your answer with the Q function: P(X>b) = Q((b-μ)/σ) (help qfunc)

analytic_probability = qfunc((b-mu)/sigma)

2 RANDOM PROCESSES

2.1 WHITE NOISE VS. COLORED NOISE

- Generate a zero-mean white Gaussian noise signal with variance σ²=3

N = 10000; % Number of generated samples
noise_var = 3; % Desired noise variance
noise = sqrt(noise_var)*randn(1,N); % Noise signal generation

- Plot the noise signal

plot(noise)
xlabel('sample index')
ylabel('noise amplitude')
title('white noise')
xlim([0 100]) %define the x-axis limits

- Plot the histogram of the noise signal to see that it is Gaussian distributed

histogram(noise,40)
xlabel('noise amplitude')
ylabel('histogram count')
title('white noise histogram')

- Implement a low-pass FIR filter (help firpm, recap from the previous exercise)
- Use a filter order of 60 and a transition band from 0.1*Fs/2 to 0.2*Fs/2

N_filter = 60; %even number
h = firpm(N_filter,[0 0.1 0.2 1],[1 1 0 0]);

- Plot the impulse response and the amplitude (frequency) response of the filter (the amplitude-response code is below; a sketch for the impulse response is given a bit further down)
- You can define the amplitude response as a function of the normalized frequency (i.e. the band -Fs/2 ... Fs/2 mapped to -1 ... 1)

N_freq = length(noise);
freq_vec_filter = -1:2/N_freq:(1-2/N_freq); %frequency vector values normalized between -1 and 1
figure, plot(freq_vec_filter,10*log10(fftshift(abs(fft(h,N_freq)))))
xlabel('normalized frequency (F_s/2=1)')
ylabel('amplitude')
title('amplitude response of the filter')

- Filter the noise signal using the above filter (help filter)

% Filter the noise signal:
filtered_noise = filter(h,1,noise);

- Plot the white noise and filtered noise signals in the same figure and compare
- Remember the filter delay in order to align the signals properly in time

filtered_noise = filtered_noise(N_filter/2+1:end); %remove the delay
N_samples_plot = 100; % number of samples to show (e.g. 100, as in the earlier xlim)
plot(noise(1:N_samples_plot))
hold on
plot(filtered_noise(1:N_samples_plot),'r')
legend('white noise','colored noise')
xlabel('sample index')
ylabel('noise amplitude')
title('white noise and filtered (colored) noise')

- Plot the histogram of the filtered noise signal (see the sketch below)
- How is it distributed? (Recap: what is the distribution of the output signal of an LTI system, if the input signal is Gaussian distributed?)
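Sketches for the two steps above that are given without code (the impulse-response plot and the filtered-noise histogram), assuming h, N_filter and filtered_noise as defined earlier:

% Impulse response of the FIR filter (the N_filter+1 filter taps):
figure, stem(0:N_filter,h)
xlabel('tap index'), ylabel('h[n]'), title('impulse response of the filter')

% Histogram of the filtered (colored) noise; it should still look Gaussian,
% since filtering is a linear operation on Gaussian samples:
figure, histogram(filtered_noise,40)
xlabel('noise amplitude'), ylabel('histogram count'), title('filtered noise histogram')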

- Compute and plot the autocorrelation function of the original white noise signal (help xcorr)

[corr_fun, lags] = xcorr(noise);
% we normalize the max value to 1 and use the stem function in order to emphasize
% the impulse-like nature of the outcome
figure, stem(lags,corr_fun/max(corr_fun))
xlabel('\tau') % \tau gives the Greek tau letter (\ works generally for the Greek alphabet)
ylabel('R(\tau)')
title('autocorrelation of white noise')
xlim([-30 30])

- What type of autocorrelation function should the white noise have?
- Compute and plot the autocorrelation function of the filtered (i.e., colored) noise signal
- Use the same statements as above, but with plot instead of stem:

[corr_fun, lags] = xcorr(filtered_noise);
figure, plot(lags,corr_fun/max(corr_fun))
% ...and so on

- Compare this with the previously plotted impulse response of the filter
- Finally, plot the power spectra (i.e., the amplitude spectrum squared) of the two noise signals in the same figure (use the decibel scale 20*log10(abs(.)))

noise_abs_spec = 20*log10(abs(fft(noise(1:length(filtered_noise)))));
filtered_noise_abs_spec = 20*log10(abs(fft(filtered_noise)));
%Define the frequency vector values (normalized between -1 and 1):
freq_vec = -1:2/length(noise_abs_spec):1-2/length(noise_abs_spec);
plot(freq_vec,fftshift(noise_abs_spec))
hold on
plot(freq_vec,fftshift(filtered_noise_abs_spec),'r')
hold off
xlabel('normalized frequency (F_s/2=1)')
ylabel('power [dB]')
title('noise spectra')
legend('white noise','filtered (coloured) noise')

- Notice that both signals are still Gaussian distributed (see the previous histograms), but the power spectra are different.
- Try to remember: what is the connection between the correlation function and the power spectral density function?
- Make sure you understand the difference between the following concepts: Gaussian noise and white noise. I.e., here both noise signals are Gaussian, but only one of them is white.

2.2 RANDOM WALK MODEL (EXAMPLE FROM THE CLASSROOM EXERCISES)

Let's consider a random sequence X[n], the so-called random walk model, defined as

X[n] = Σ_{i=1}^{n} W[i]  (i.e. X[n] = W[1] + W[2] + ... + W[n]),

where the W[i] are binary i.i.d. (independent and identically distributed) random variables with probabilities P[W[i] = s] = p and P[W[i] = -s] = 1-p.

- Generate a random process of 2000 samples and 5000 realizations (= ensemble size)
- Use p=0.5 and s=1 first, but write the code so that you can conveniently test the process with other values too

N_samples = 2000; %Number of samples for each realization
N_ensemble = 5000; %Number of signal realizations (i.e., the size of the ensemble)
%Step probability and step size:
p = 0.5; % P(Wi=s) = p, P(Wi=-s) = 1-p
s = 1; %step length
n = 1:N_samples; % vector of sample indices
% Generating the matrix of randomly generated steps:

W = rand(N_ensemble,N_samples); % (i.e. uniformly distributed random values between 0 and 1)
indices_with_positive_s = W < p;   % find out the steps going "up"
W(indices_with_positive_s) = s;    % Define steps for going "up"
W(~indices_with_positive_s) = -s;  % Define steps for going "down"
% The overall "random walk" is achieved by taking the cumulative sum over the
% steps:
X = cumsum(W,2);
% (Notice that now each row describes one random walk realization, so the sum
% is taken over the 2nd dimension)

- Plot five example realizations of the random walk process (help for)
- Use subplots inside one figure (help subplot)

for ind = 1:5
    subplot(5,1,ind)
    plot(n,X(ind,:))
    xlabel('n')
    ylabel('X(n)')
    grid on
    title(['realization #' num2str(ind)]) %num2str converts a numerical value into a character string
end
% Here is a handy way to get a full-screen figure
% (otherwise the figure might be too unclear):
set(gcf,'units','normalized','outerposition',[0 0 1 1])

- Calculate the ensemble mean and the ensemble variance of the process
- Plot the calculated ensemble mean E[X[n]] and variance E[(X[n]-E[X[n]])^2] in the same figure with the theoretical values, given as
  - Theoretical mean: E[X[n]] = ns(2p-1)
  - Theoretical variance: E[(X[n]-E[X[n]])^2] = np(2s)^2(1-p)
  (These follow from E[W[i]] = sp - s(1-p) = s(2p-1) and Var(W[i]) = s^2 - s^2(2p-1)^2 = 4s^2p(1-p), summed over the n i.i.d. steps.)
- Notice that the ensemble mean and variance are now functions of the sequence index n ("time"), so there will be a specific mean and variance for each sequence index

mean_theory = n*s*(2*p-1);      % Theoretical mean
var_theory = n*(2*s)^2*p*(1-p); % Theoretical variance
mean_observed = mean(X);        % Empirical mean (i.e., what we observe)
var_observed = var(X);          % Empirical variance (i.e., what we observe)

plot(n,mean_observed,'b','linewidth',3)
hold on
plot(n,mean_theory,'r:','linewidth',2)
hold off
legend('observed mean','theoretical mean')
ylim([-2 2]) % set the axis limits in the y-direction only
xlabel('n')
ylabel('mean')
title('mean over the sample index')
% ...and the same for the variance (a sketch is given below)
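A minimal sketch of the corresponding variance plot, reusing var_observed and var_theory from above (opening a new figure window here is just one option):

figure
plot(n,var_observed,'b','linewidth',3)
hold on
plot(n,var_theory,'r:','linewidth',2)
hold off
legend('observed variance','theoretical variance')
xlabel('n')
ylabel('variance')
title('variance over the sample index')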

- Re-run the process with different values of p and s, and try a different number of realizations
- Due to memory issues, do not try too many realizations at the same time (stay below 100000 realizations and 2000 samples)
- What if the number of realizations is very low?
- Is the process stationary (in the wide sense)? Why?/Why not?