Delay tomography for large scale networks


Delay tomography for large scale networks. MENG-FU SHIH, ALFRED O. HERO III. Communications and Signal Processing Laboratory, Electrical Engineering and Computer Science Department, University of Michigan, 1301 Beal Ave., Ann Arbor, MI 48109. URSI General Assembly 2002.

Network Monitoring and Diagnosis
Delay, packet loss rate, traffic type, ...
Problems with direct measurement (RMON):
Diagnosis unavailable or disabled at internal nodes.
Non-cooperative internal nodes.
All internal nodes must be synchronized.

Network Tomography Problem
End-to-end measurements.
Active vs. passive method.
Active method: send probe packets.
Passive method: monitor existing flows.

Importance of Link Delay Statistics
Assessment and updating of routing/flow control.
QoS assurance, especially for video/audio streaming.
Network upgrade/maintenance planning.
Security, e.g., distributed Denial-of-Service (DoS) attacks.

Problem Formulation: General Notations
Logical tree T = (V, E): V = nodes, E = links; L links, R leaf nodes/probe paths.
M_i: number of packets sent from the root to leaf i.
N_i: the set of links in probe path i.
X_l^{(i,n)}: delay of the nth probe packet at link l along path i, including queueing delay, retransmission delay, and possibly propagation delay.
Y^{(i,n)} = \sum_{l \in N_i} X_l^{(i,n)}: nth end-to-end probe delay along path i.
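
A minimal sketch (not from the talk) of this notation for a hypothetical 3-link, 2-receiver tree: path 1 uses links {1, 2}, path 2 uses links {1, 3}, and each end-to-end delay Y^{(i,n)} is the sum of the per-link delays on its path. The exponential link-delay model and all numeric values here are illustrative assumptions.

# Hypothetical tree: N_1 = {1, 2}, N_2 = {1, 3}; link delays drawn from a
# placeholder exponential model (the talk uses finite mixtures instead).
import numpy as np

rng = np.random.default_rng(0)
paths = {1: [1, 2], 2: [1, 3]}            # N_i: links on probe path i
M = {1: 1000, 2: 1000}                    # M_i: probes sent on path i
scale = {1: 1.0, 2: 0.5, 3: 2.0}          # assumed per-link mean delay

def sample_link_delay(l, size):
    return rng.exponential(scale[l], size)   # X_l^(i,n), n = 1..size

# Y^(i,n) = sum over l in N_i of X_l^(i,n)
Y = {i: sum(sample_link_delay(l, M[i]) for l in links) for i, links in paths.items()}
print({i: round(y.mean(), 3) for i, y in Y.items()})   # end-to-end mean delays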

Problem Formulation: General Assumptions
Network Assumptions
N1) Network topology is known.
N2) Probe paths (routing table) are known.
N3) Cooperating edge nodes are synchronized.
Statistical Assumptions
S1) Spatial Independence: for a given packet n along path i, the link delays {X_l^{(i,n)} : l \in N_i} are mutually independent.
S2) Temporal Independence and Stationarity: if paths i and j both contain link l, then X_l^{(i,n)} and X_l^{(j,k)} are i.i.d.

Discrete Delay Model
Link delays are discretized with bin size q.
Link delay values X_l^{(i,n)} \in {0, q, 2q, ..., qd}.
Link delay p.m.f.: p_{l,m} = P(X_l^{(i,n)} = qm), m = 0, ..., d.
Lemma 1. The delay p.m.f. with two bins at each link is uniquely identifiable from end-to-end packet delays, except when the delay p.m.f.'s at all links are identical.
(The slide also gives the matrix A expressing this mapping in terms of the link probabilities p_1, p_2, p_3 and the quantities Q_1, Q_2.)
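
A quick sketch of the discretization step; the bin size, bin count, and the exponential raw delays below are assumed values, not the talk's.

# Map each raw delay to the nearest multiple of q, capped at qd, then form the
# empirical link-delay p.m.f. p_{l,m} = P(X_l = qm).
import numpy as np

rng = np.random.default_rng(1)
q, d = 0.5, 8
x = rng.exponential(1.0, 10_000)                     # raw continuous link delays

m = np.clip(np.round(x / q).astype(int), 0, d)       # bin index 0..d
pmf = np.bincount(m, minlength=d + 1) / len(x)       # estimates of p_{l,0}, ..., p_{l,d}
print(pmf.round(3))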

Continuous Delay Model: Gaussian Mixture
Arbitrary shapes of link delay distributions.
Let f_l(x) be the link delay p.d.f. at link l:
f_l(x) = \sum_{m=1}^{k_l} \alpha_{l,m} \phi(x; \theta_{l,m})
k_l: the number of mixture components.
\alpha_{l,m}: mixing probability for the mth component, 0 \le \alpha_{l,m} \le 1, \sum_m \alpha_{l,m} = 1.
\phi(x; \theta_{l,m}): Gaussian density function with mean \mu_{l,m} and variance \sigma^2_{l,m}, \theta_{l,m} = {\mu_{l,m}, \sigma^2_{l,m}}.
Example: 0.5 \phi(0, 1) + 0.3 \phi(2, 1) + 0.2 \phi(5, 4).
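
A short sketch of the example mixture above (the second argument of \phi is a variance, following the slide's notation); the weights, means, and variances come from the example, everything else is my own scaffolding.

# Density and sampler for 0.5*phi(x; 0, 1) + 0.3*phi(x; 2, 1) + 0.2*phi(x; 5, 4).
import numpy as np
from scipy.stats import norm

w   = np.array([0.5, 0.3, 0.2])          # alpha_{l,m}
mu  = np.array([0.0, 2.0, 5.0])          # component means
var = np.array([1.0, 1.0, 4.0])          # component variances

def f(x):
    x = np.atleast_1d(x)[:, None]
    return (w * norm.pdf(x, loc=mu, scale=np.sqrt(var))).sum(axis=1)

rng = np.random.default_rng(2)
comp = rng.choice(3, size=10_000, p=w)                 # pick a component per sample
samples = rng.normal(mu[comp], np.sqrt(var[comp]))
print(f([0.0, 2.0, 5.0]).round(3), samples.mean().round(3))   # mixture mean is 1.6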

Continuous Delay Model: Identifiability Problem
Example: two-leaf tree (source, link 1 to the branching node, links 2 and 3 to receivers 1 and 2). Let k_1 = k_2 = k_3 = 1.
f(y_1, y_2) = \phi(y_1; \mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2) \phi(y_2; \mu_1 + \mu_3, \sigma_1^2 + \sigma_3^2)
\mu_{Y_1} = \mu_1 + \mu_2,  \mu_{Y_2} = \mu_1 + \mu_3
\sigma_{Y_1}^2 = \sigma_1^2 + \sigma_2^2,  \sigma_{Y_2}^2 = \sigma_1^2 + \sigma_3^2
4 equations with 6 unknowns!
(Receiver 1 observes y_1; receiver 2 observes y_2.)
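
A tiny numeric illustration of this gap (the parameter values are made up): two different link-parameter assignments produce identical end-to-end means and variances, so end-to-end Gaussian statistics alone cannot separate them.

# Two distinct (mu_l, sigma_l^2) assignments for links 1, 2, 3 on the two-leaf tree.
theta_a = {"mu": (1.0, 2.0, 3.0), "var": (1.0, 1.0, 1.0)}
theta_b = {"mu": (2.0, 1.0, 2.0), "var": (0.5, 1.5, 1.5)}

def end_to_end(theta):
    mu, var = theta["mu"], theta["var"]
    # path 1 = link 1 + link 2, path 2 = link 1 + link 3
    return (mu[0] + mu[1], var[0] + var[1]), (mu[0] + mu[2], var[0] + var[2])

print(end_to_end(theta_a))   # ((3.0, 2.0), (4.0, 2.0))
print(end_to_end(theta_b))   # ((3.0, 2.0), (4.0, 2.0))  -- indistinguishable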

Mixed Finite Mixture Model
\rho: utilization factor of a queueing system, 0 < \rho < 1 for a stable system; P(queue is empty) = 1 - \rho.
Introduce a delta component at (or near) 0 with probability mass \alpha_{l,0}.
The link delay p.d.f. becomes
f_l(x) = \alpha_{l,0} \delta(x) + \sum_{m=1}^{k_l} \alpha_{l,m} \phi(x; \theta_{l,m}), with \sum_{m=0}^{k_l} \alpha_{l,m} = 1.
Sufficient condition for identifiability (asymptotic): the delay distribution defined above is identifiable from end-to-end measurements if (1) \alpha_{l,0} > 0 for all l, and (2) all the Gaussian components in the link delay distributions have distinct means and variances.
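
A brief sampling sketch of this mixed model (all numbers below are illustrative assumptions): a probe sees zero delay with probability \alpha_{l,0} (empty queue) and otherwise draws its delay from one of the Gaussian components.

# Sample from alpha_0*delta(x) + alpha_1*phi(x; 4, 1) + alpha_2*phi(x; 8, 4).
import numpy as np

rng = np.random.default_rng(3)
alpha = np.array([0.3, 0.49, 0.21])     # alpha_{l,0} (delta at 0), alpha_{l,1}, alpha_{l,2}
mu    = np.array([4.0, 8.0])            # Gaussian component means (m = 1, 2)
sd    = np.array([1.0, 2.0])            # Gaussian component standard deviations

def sample_mixed(n):
    comp = rng.choice(3, size=n, p=alpha)        # 0 = empty queue, 1..2 = Gaussian parts
    x = np.zeros(n)
    busy = comp > 0
    x[busy] = rng.normal(mu[comp[busy] - 1], sd[comp[busy] - 1])
    return x

x = sample_mixed(10_000)
print(round((x == 0).mean(), 3))                 # close to alpha_{l,0} = 0.3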

Mixed Finite Mixture Model: Example
f_1(x_1) = 0.1 \delta(x_1) + 0.9 \phi(x_1; 0, 1)
f_2(x_2) = 0.3 \delta(x_2) + 0.7 \phi(x_2; 4, 2)
f_3(x_3) = 0.03 \delta(x_3) + 0.27 \phi(x_3; 0, 1) + 0.07 \phi(x_3; 4, 2) + 0.63 \phi(x_3; 4, 3), where x_3 = x_1 + x_2.
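
The third density is just the convolution of the first two: mixture weights multiply, Gaussian means and variances add, and the delta at 0 acts as the identity. A short check of that arithmetic, using the component parameters from the example above:

# f_3 = f_1 * f_2 (convolution of the two mixed link densities).
from itertools import product

f1 = [(0.1, None), (0.9, (0.0, 1.0))]   # (weight, (mean, var)); None = delta at 0
f2 = [(0.3, None), (0.7, (4.0, 2.0))]

def convolve(a, b):
    out = []
    for (w1, g1), (w2, g2) in product(a, b):
        if g1 is None and g2 is None:
            out.append((round(w1 * w2, 4), None))
        else:
            m1, v1 = g1 or (0.0, 0.0)
            m2, v2 = g2 or (0.0, 0.0)
            out.append((round(w1 * w2, 4), (m1 + m2, v1 + v2)))
    return out

print(convolve(f1, f2))
# [(0.03, None), (0.07, (4.0, 2.0)), (0.27, (0.0, 1.0)), (0.63, (4.0, 3.0))]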

EM Estimation Algorithm: Notations
Assume prior knowledge of k_l.
Component indicator vector z_l^{(i,n)} = {z_{l,0}^{(i,n)}, z_{l,1}^{(i,n)}, ..., z_{l,k_l}^{(i,n)}}:
z_{l,m}^{(i,n)} = 1 if x_l^{(i,n)} is generated by the mth component, 0 otherwise.
Unobserved data {x, z}: x = {x_l^{(i,n)}}, z = {z_l^{(i,n)}}.
Observed data: y = {y^{(i,n)}}.
Complete data: {x, y, z}.
Parameter vector \Theta = {\alpha_{l,m}, \theta_{l,m}}.

EM Estimation Algorithm
Complete-data log-likelihood:
\log L(x, z; \Theta) = \sum_i \sum_{n=1}^{M_i} \sum_{l \in N_i} [ z_{l,0}^{(i,n)} \log \alpha_{l,0} + \sum_{m=1}^{k_l} z_{l,m}^{(i,n)} ( \log \alpha_{l,m} + \log \phi(x_l^{(i,n)}; \theta_{l,m}) ) ]
Let \omega_{l,m}^{(i,n)} = E[ z_{l,m}^{(i,n)} | y^{(i,n)}; \Theta_t ]
Q_{l,m}^{(i,n)}(\theta_{l,m}, \Theta_t) = E[ z_{l,m}^{(i,n)} \log \phi(x_l^{(i,n)}; \theta_{l,m}) | y^{(i,n)}; \Theta_t ]

EM Estimation Algorithm
E-step:
E[ \log L(x, z; \Theta) | y; \Theta_t ] = \sum_i \sum_{n=1}^{M_i} \sum_{l \in N_i} [ \sum_{m=0}^{k_l} \omega_{l,m}^{(i,n)} \log \alpha_{l,m} + \sum_{m=1}^{k_l} Q_{l,m}^{(i,n)}(\theta_{l,m}, \Theta_t) ]
M-step:
\alpha_{l,m}^{t+1} = \sum_{i: l \in N_i} \sum_{n=1}^{M_i} \omega_{l,m}^{(i,n)} / \sum_{i: l \in N_i} M_i
\theta_{l,m}^{t+1} = \arg\max_{\theta_{l,m}} \sum_{i: l \in N_i} \sum_{n=1}^{M_i} Q_{l,m}^{(i,n)}(\theta_{l,m}, \Theta_t)
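
For intuition only, here is a compressed EM sketch for the degenerate case of a single link whose delays are observed directly, so the responsibilities have a closed form. In the talk's setting the same \omega and Q quantities are expectations conditioned on the end-to-end sums y, which is the harder part and is not shown here; all function and variable names below are mine.

# EM for alpha_0*delta(0) + sum_m alpha_m*phi(mu_m, var_m), x observed directly.
import numpy as np
from scipy.stats import norm

def em_mixed(x, k, iters=200):
    x = np.asarray(x, float)
    n = len(x)
    alpha = np.full(k + 1, 1.0 / (k + 1))            # alpha_0 (delta), alpha_1..k
    mu = np.linspace(x[x > 0].min(), x.max(), k)
    var = np.full(k, x.var())
    for _ in range(iters):
        # E-step: responsibilities; exact zeros belong to the delta component.
        resp = np.zeros((n, k + 1))
        resp[:, 0] = alpha[0] * (x == 0)
        resp[:, 1:] = alpha[1:] * norm.pdf(x[:, None], mu, np.sqrt(var)) * (x[:, None] > 0)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: closed-form reweighting and refit of each Gaussian component.
        alpha = resp.mean(axis=0)
        w = resp[:, 1:]
        mu = (w * x[:, None]).sum(axis=0) / w.sum(axis=0)
        var = (w * (x[:, None] - mu) ** 2).sum(axis=0) / w.sum(axis=0)
    return alpha, mu, var

# Usage with synthetic data (illustrative parameters: alpha_0 = 0.3, one Gaussian).
rng = np.random.default_rng(4)
idle = rng.random(5000) < 0.3
x = np.where(idle, 0.0, rng.normal(4.0, 1.0, 5000))
print(em_mixed(x, k=1))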

Computer Experiment
Matlab simulation with 5000 i.i.d. end-to-end delays for each probe path (source tree topology shown on the slide).
Table (links 1-7): number of Gaussian mixture components k_l, true delta factor \alpha_{l,0}, and EM estimate; the estimates track the true values closely (e.g., link 2: true 0.3, estimated 0.304; link 3: true 0.1, estimated 0.099).

Figure: True (solid) and estimated (dotted) Gaussian mixture components.

Conclusion and Extensions
Conclusions
Discussion of discrete and continuous delay models.
Proposed mixed finite Gaussian mixture model for link delay.
EM algorithm implementation with known model orders.
Extensions
Unsupervised model order estimation.
Adaptive algorithm for parameter and model order update.

References
F. Lo Presti, N. G. Duffield, J. Horowitz, D. Towsley, "Multicast-based inference of network-internal delay distributions," UMass CMPSCI 99-55, 1999. Logical multicast tree. Discrete link delays with finite levels. Canonical delay tree, i.e., there is a nonzero probability that a probe experiences no delay in traversing each link. Sample-average approach. Identifiability is proved by showing that a bijective mapping exists from the link delay distributions to the probabilities of the events in which the end-to-end delay is no greater than q for at least one receiver. The continuous model is discussed, but its identifiability problem is left open.
M. Coates and R. Nowak, "Network tomography for internal delay estimation," ICASSP 2001, Salt Lake City, May 2001. Logical unicast tree. Discrete link delays with finite levels. Back-to-back packet pair measurements. MLE using an EM-based algorithm. Sequential Monte Carlo tracking of time variation.