CS 798: Homework Assignment 2 (Probability)

Assigned: September 30, 2009

1.0 Sample space

In the IEEE 802.11 protocol, the congestion window (CW) parameter is used as follows: initially, a terminal waits for a random time period (called backoff) chosen in the range [1, 2·CW] before sending a packet. If an acknowledgement for the packet is not received in time, then CW is doubled and the process is repeated, until CW reaches the value CWMAX. The initial value of CW is CWMIN. What is the sample space for (a) the value of CW? (b) the value of the backoff?

Solution: The sample space for CW is the discrete set {CWMIN, 2·CWMIN, 4·CWMIN, ..., 2^K·CWMIN}, where K is the largest integer such that 2^K·CWMIN <= CWMAX. The sample space for the backoff, given CW, is a subset of the real line defined by the interval [1, 2·CW]. (A small simulation sketch of this process follows Exercise 4.0.)

2.0 Interpretations of probability

Consider the statement: given the conditions right now, the probability of a snowstorm tomorrow morning is 25%. How would you interpret this statement from the perspective of an objective, a frequentist, and a subjective interpretation of probability (assuming these are possible)?

Solution: An objective interpretation would be that we have a complete weather model that has an intrinsic source of randomness; given this model and the current weather conditions, the model predicts that the probability of a snowstorm is 25%. A frequentist approach would be to look at all prior days on which today's weather conditions also held and count the days on which there was a snowstorm the next morning; we would find that, given the current weather, there was a snowstorm 25% of the time. A subjective interpretation would be that an expert who knew all the variables would take 4:1 odds (or better) on a bet that it will snow tomorrow.

3.0 Conditional probability

Consider a device that samples packets on a link. (a) Suppose that measurements show that 20% of packets are UDP and that 10% of all packets are UDP packets with a packet size of 100 bytes. What is the conditional probability that a UDP packet has size 100 bytes? (b) Suppose 50% of packets were UDP, and 50% of UDP packets were 100 bytes long. What fraction of all packets are 100-byte UDP packets?

Solution: (a) We have P(UDP) = 0.2 and P(UDP AND 100) = 0.1, so P(100 | UDP) = 0.1/0.2 = 0.5. (b) Here, P(UDP) = 0.5 and P(100 | UDP) = 0.5, so P(100 AND UDP) = 0.5 × 0.5 = 0.25. (The arithmetic is re-derived numerically after Exercise 4.0.)

4.0 Conditional probability again

Continuing with Exercise 3.0: how does knowledge of the protocol type change the sample space of possible packet lengths? In other words, what is the sample space before and after you know the protocol type of a packet?

Solution: Before you know the protocol type of a packet, the sample space is all possible packet lengths of all possible protocol types. After you know the protocol type, the sample space includes only the packet lengths for that protocol.
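
To make the two sample spaces in Exercise 1.0 concrete, here is a minimal Python sketch of the backoff process. The CWMIN, CWMAX, and loss-probability values are invented for illustration (the assignment does not specify them), and the backoff is drawn uniformly from [1, 2·CW], following the reading of the problem statement above.

    import random

    CWMIN, CWMAX = 32, 1024  # hypothetical values; not given in the assignment
    LOSS = 0.1               # assumed probability that an ack is not received

    def send_one_packet():
        """Simulate the CW-doubling process for a single packet.

        Returns the list of (CW, backoff) pairs that were drawn, i.e.,
        one outcome from the sample spaces described in Exercise 1.0.
        """
        cw = CWMIN
        history = []
        while True:
            backoff = random.uniform(1, 2 * cw)  # backoff sample space: [1, 2*CW]
            history.append((cw, backoff))
            if random.random() > LOSS:           # ack received: done
                return history
            cw = min(2 * cw, CWMAX)              # CW doubles, capped at CWMAX

    print(send_one_packet())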

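The arithmetic in Exercise 3.0 is just the chain rule P(A AND B) = P(B | A)·P(A); the few lines below re-derive both answers, using only the numbers given in the problem.

    # Exercise 3.0(a): from P(UDP) and P(UDP AND 100), recover P(100 | UDP).
    p_udp = 0.2
    p_udp_and_100 = 0.1
    print(p_udp_and_100 / p_udp)    # 0.5

    # Exercise 3.0(b): from P(UDP) and P(100 | UDP), recover P(100 AND UDP).
    p_udp = 0.5
    p_100_given_udp = 0.5
    print(p_100_given_udp * p_udp)  # 0.25
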
5.0 Bayes' rule

For Exercise 3.0(a), what additional information do you need to compute P(UDP | 100)? Setting that value to x, express P(UDP | 100) in terms of x.

Solution: By Bayes' rule, P(UDP | 100) = P(100 | UDP)·P(UDP)/P(100). The missing quantity is P(100); set P(100) = x. Then P(UDP | 100) = 0.5 × 0.2/x = 0.1/x.

6.0 Cumulative distribution function

(a) Suppose discrete random variable D takes value i from {1, 2, 3, ...} with probability 1/2^i. What is its CDF? (b) Suppose continuous random variable C is uniform in the range [x1, x2]. What is its CDF?

Solution: (a) F_D(i) = Σ_{j=1..i} 1/2^j = 1 − 2^(−i). (b) f_C(x) = 1/(x2 − x1), so F_C(x) = ∫_{x1..x} dt/(x2 − x1) = (x − x1)/(x2 − x1).

7.0 Expectations

Compute the expectations of the D and C in Exercise 6.0.

Solution: (a) E[D] = Σ_{i=1..∞} i/2^i = 2. (b) By geometry, E[C] = (x1 + x2)/2 (you can also derive this analytically).

8.0 Variance

Prove that V[aX] = a^2·V[X].

Solution: V[aX] = E[a^2·X^2] − (E[aX])^2 = a^2·E[X^2] − a^2·(E[X])^2 = a^2·(E[X^2] − (E[X])^2) = a^2·V[X]. (These identities are sanity-checked numerically in the second snippet after Exercise 9.0.)

9.0 Bernoulli distribution

A hotel has 20 guest rooms. Assuming that outgoing calls are independent and that a guest room makes 10 minutes' worth of outgoing calls during the busiest hour of the day, what is the probability that 5 calls are simultaneously active during the busiest hour? What is the probability of 15 simultaneous calls?

Solution: Consider the event E defined as "room X is making an outgoing call during the busy hour". Clearly, P(E) = p = 10/60 = 1/6, and the number of simultaneous calls is binomial with n = 20. The probability of 5 simultaneous calls is C(20,5)·(1/6)^5·(5/6)^15 ≈ 0.129, and the probability of 15 simultaneous calls is C(20,15)·(1/6)^15·(5/6)^5 ≈ 1.3 × 10^(−8).
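
A short check of Exercise 9.0's binomial probabilities, using only the standard library (math.comb is available in Python 3.8+):

    from math import comb

    def binom_pmf(n, k, p):
        """P(X = k) for X ~ Binomial(n, p)."""
        return comb(n, k) * p**k * (1 - p)**(n - k)

    print(binom_pmf(20, 5, 1/6))    # ~0.129
    print(binom_pmf(20, 15, 1/6))   # ~1.3e-08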

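The closed forms in Exercises 6.0-8.0 are also easy to sanity-check numerically. The snippet below sums the series for E[D], evaluates the CDF F_D, and verifies V[aX] = a^2·V[X] by Monte Carlo; the choice of an exponential sample for the variance check is mine, purely for illustration.

    import random

    # Exercise 7.0(a): the partial sums of i/2^i converge to 2.
    print(sum(i / 2**i for i in range(1, 60)))  # ~2.0

    # Exercise 6.0(a): F_D(i) = 1 - 2^(-i).
    print([1 - 2**-i for i in (1, 2, 3, 4)])    # [0.5, 0.75, 0.875, 0.9375]

    # Exercise 8.0: V[aX] = a^2 V[X], checked with a = 3 on simulated data.
    def var(v):
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / len(v)

    xs = [random.expovariate(1.0) for _ in range(200_000)]
    print(var([3 * x for x in xs]) / var(xs))   # ~9.0
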
10.0 Geometric distribution

Consider a link that has a packet loss rate of 10%. Suppose that every packet transmission has to be acknowledged. Compute the expected number of transmissions for a successful packet+ack transfer.

Solution: Packet and ack transmissions are geometrically distributed with success probability p = 0.9, so the expected number of packet transmissions until one gets through is 1/p = 1/0.9 ≈ 1.11, and the expected number of ack transmissions is likewise ≈ 1.11. The losses are independent, so the expected number of transmissions for a successful packet+ack transfer is 1/0.9 + 1/0.9 ≈ 2.22. (The simulation sketch after Exercise 13.0 confirms this value.)

11.0 Poisson distribution

Consider a binomially distributed random variable X with parameters n = 10, p = 0.1. (a) Compute the value of P(X = 8) using both the binomial distribution and the Poisson approximation. (b) Repeat for n = 100, p = 0.1.

Solution: (a) Using the binomial distribution, the value is C(10,8)·(0.1)^8·(0.9)^2 ≈ 3.6 × 10^(−7). For the Poisson approximation, λ = np = 1, so the value is P(X = 8) = e^(−1)·1^8/8! ≈ 9.1 × 10^(−6). (b) Using the binomial distribution, the value is C(100,8)·(0.1)^8·(0.9)^92 ≈ 0.115. For the Poisson approximation, λ = 10, so the value is P(X = 8) = e^(−10)·10^8/8! ≈ 0.113. It is clear that as n increases, the approximation greatly improves. (Both comparisons are tabulated in a snippet after Exercise 13.0.)

12.0 Gaussian distribution

Prove that if X is Gaussian with parameters (μ, σ^2), then the random variable Y = aX + b, where a and b are constants, is also Gaussian, with parameters (aμ + b, (aσ)^2).

Solution: Consider the cumulative distribution of Y:

    F_Y(y) = P(Y <= y) = P(aX + b <= y) = P(X <= (y − b)/a) = F_X((y − b)/a)   if a > 0.

Differentiating,

    f_Y(y) = F'_Y(y) = (1/a)·f_X((y − b)/a)
           = (1/(aσ√(2π)))·exp(−((y − b)/a − μ)^2 / (2σ^2))
           = (1/(aσ√(2π)))·exp(−(y − b − aμ)^2 / (2a^2·σ^2))
           = (1/(aσ√(2π)))·exp(−(y − (b + aμ))^2 / (2(aσ)^2)).

Comparing with the standard definition of a Gaussian, we see that the parameters of Y are (aμ + b, (aσ)^2). A similar calculation holds if a < 0.

13.0 Exponential distribution

Suppose that customers arrive at a bank with exponentially distributed inter-arrival times with mean 5 minutes. A customer walks into the bank at 3pm. What is the probability that the next customer arrives no sooner than 3:15?

Solution: We have 1/λ = 5. We need to compute 1 − F(15) = e^(−λ·15) = e^(−15/5) = e^(−3) ≈ 4.98%.
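
Exercise 10.0's answer is easy to validate empirically. The sketch below simulates the retransmit-until-acked process, assuming (as the solution does) that losses in the two directions are independent, each with probability 0.1.

    import random

    LOSS = 0.1  # 10% loss in each direction, assumed independent

    def transmissions_for_transfer():
        """Count transmissions until the packet, and then its ack, get through."""
        count = 0
        for _ in range(2):        # first the packet, then the ack
            while True:           # geometric number of attempts
                count += 1
                if random.random() > LOSS:
                    break
        return count

    n = 200_000
    print(sum(transmissions_for_transfer() for _ in range(n)) / n)  # ~2.22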

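For Exercise 11.0, the binomial-versus-Poisson comparison can be printed directly; both PMFs below are the standard closed forms, so the only inputs are n, p, and λ = np.

    from math import comb, exp, factorial

    def binom_pmf(n, k, p):
        return comb(n, k) * p**k * (1 - p)**(n - k)

    def poisson_pmf(lam, k):
        return exp(-lam) * lam**k / factorial(k)

    # (a) n = 10: the approximation is off by more than an order of magnitude.
    print(binom_pmf(10, 8, 0.1), poisson_pmf(1, 8))    # ~3.6e-07 vs ~9.1e-06
    # (b) n = 100: the two values nearly agree.
    print(binom_pmf(100, 8, 0.1), poisson_pmf(10, 8))  # ~0.1148 vs ~0.1126
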
14.0 Exponential distribution

It is late August and you are watching the Perseid meteor shower. You are told that the time between meteors is exponentially distributed with a mean of 200 seconds. At 10:05pm you see a meteor, after which you head to the kitchen for a bowl of ice cream, returning outside at 10:08pm. How long do you expect to wait to see the next meteor?

Solution: Because the exponential distribution is memoryless, the expected waiting time is the same, i.e., 200 seconds, no matter how long your ice-cream break lasts. Isn't that nice?

15.0 Power law

Consider a power-law distribution with x_min = 1 and α = 2, and an exponential distribution with λ = 2. Fill in the following table (a snippet regenerating it programmatically follows Exercise 16.0):

    x      f_power_law(x)   f_exponential(x)
    1
    5
    10
    50
    100

Solution:

    x      f_power_law(x)   f_exponential(x)
    1      1                0.27
    5      0.04             9.08 × 10^(-5)
    10     0.01             4.12 × 10^(-9)
    50     4 × 10^(-4)      7.44 × 10^(-44)
    100    1 × 10^(-4)      2.77 × 10^(-87)

It should now be obvious why a power-law distribution is called heavy-tailed!

16.0 Markov's inequality

Consider a random variable X that is exponentially distributed with parameter λ = 2. What is the probability that X > 10, using (a) the exponential distribution and (b) Markov's inequality?

Solution: (a) We need 1 − F(10) = e^(−2·10) = e^(−20) ≈ 2.06 × 10^(−9). (b) The mean of this distribution is 1/2 = 0.5, so Markov's inequality gives P(X ≥ 10) ≤ 0.5/10 = 0.05. It is clear that the bound is very loose.
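
The table in Exercise 15.0 can be regenerated in a few lines; f_power_law below is the standard Pareto density f(x) = ((α − 1)/x_min)·(x/x_min)^(−α), which for α = 2 and x_min = 1 reduces to x^(−2) and matches the filled-in values.

    from math import exp

    def f_power_law(x, alpha=2.0, xmin=1.0):
        """Standard Pareto (power-law) density, defined for x >= xmin."""
        return ((alpha - 1) / xmin) * (x / xmin) ** -alpha

    def f_exponential(x, lam=2.0):
        return lam * exp(-lam * x)

    for x in (1, 5, 10, 50, 100):
        print(x, f_power_law(x), f_exponential(x))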

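And the two tail estimates of Exercise 16.0 side by side, showing just how loose the Markov bound is here:

    from math import exp

    lam = 2.0
    exact = exp(-lam * 10)   # 1 - F(10) = e^(-20), about 2.06e-09
    markov = (1 / lam) / 10  # Markov: P(X >= 10) <= E[X]/10 = 0.05
    print(exact, markov)     # the bound is off by over seven orders of magnitude
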
17.0 Joint probability distribution

Consider the following probability mass function defined jointly over the random variables X, Y, and Z, where each entry gives P(X = x, Y = y, Z = z) for the bit string xyz:

P(000) = 0.05; P(001) = 0.05; P(010) = 0.1; P(011) = 0.3; P(100) = 0.05; P(101) = 0.05; P(110) = 0.1; P(111) = 0.3.

(a) Write down p_X, p_Y, p_Z, p_XY, p_XZ, p_YZ. (b) Are X and Y, X and Z, or Y and Z independent? (c) What is the probability that X = 0 given that Z = 1?

Solution: (a) p_X = {0.5, 0.5}; p_Y = {0.2, 0.8}; p_Z = {0.3, 0.7}; p_XY = {0.1, 0.4, 0.1, 0.4}; p_XZ = {0.15, 0.35, 0.15, 0.35}; p_YZ = {0.1, 0.1, 0.2, 0.6}. (b) X and Y are independent because p_XY = p_X·p_Y, and X and Z are independent because p_XZ = p_X·p_Z. Y and Z are not independent: p_YZ(0,0) = 0.1, but p_Y(0)·p_Z(0) = 0.2 × 0.3 = 0.06. (c) P(X = 0 | Z = 1) = P(X = 0 AND Z = 1)/P(Z = 1) = 0.35/0.7 = 0.5.
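
Finally, the marginals and independence checks of Exercise 17.0 can be done mechanically by summing out the joint table; the only inputs are the eight probabilities above.

    # Exercise 17.0: marginalize the joint PMF and test independence.
    p = {(0,0,0): 0.05, (0,0,1): 0.05, (0,1,0): 0.10, (0,1,1): 0.30,
         (1,0,0): 0.05, (1,0,1): 0.05, (1,1,0): 0.10, (1,1,1): 0.30}

    def marginal(keep):
        """Marginal PMF over the axes listed in `keep` (0 = X, 1 = Y, 2 = Z)."""
        out = {}
        for xyz, pr in p.items():
            key = tuple(xyz[i] for i in keep)
            out[key] = out.get(key, 0.0) + pr
        return out

    pY, pZ = marginal([1]), marginal([2])
    pYZ, pXZ = marginal([1, 2]), marginal([0, 2])

    # (b) Y and Z are dependent: p_YZ(0,0) != p_Y(0) * p_Z(0).
    print(pYZ[(0, 0)], pY[(0,)] * pZ[(0,)])  # 0.1 vs 0.06
    # (c) P(X=0 | Z=1) = P(X=0, Z=1) / P(Z=1).
    print(pXZ[(0, 1)] / pZ[(1,)])            # 0.5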