
Chapter 3

Alpha-Stable Random Variables and Processes

Gaussian distributions and processes have long been accepted as useful tools for stochastic modeling. In this section, we introduce a statistical model based on the class of symmetric $\alpha$-stable (S$\alpha$S) distributions which is well suited for describing signals that are impulsive in nature. A review of the state of the art on stable processes from a statistical point of view is provided by a collection of papers edited by Cambanis, Samorodnitsky and Taqqu [11]. Several statisticians, including Cambanis, Zolotarev, Weron, et al., have published extensively on the theory and applications of stable processes. They studied the properties of stable processes [6, 8, 9, 33, 36, 43, 46, 48, 65, 9, 98], their spectral representation [7, 7, 47], as well as prediction and linear filtering problems [9, 1, 35, 37, 6, 94]. Textbooks in the area were written by Samorodnitsky and Taqqu [64] and by Janicki and Weron [3]. An extensive review of stable processes from a signal processing point of view can be found in a tutorial paper by Shao and Nikias [69] as well as in a monograph written by the same authors [5].

3.1 The Class of Real S$\alpha$S Distributions

The symmetric $\alpha$-stable (S$\alpha$S) distribution is best defined by its characteristic function

\varphi(\omega) = \exp(j\delta\omega - \gamma|\omega|^\alpha)   (3.1)

where $\alpha$ is the characteristic exponent, restricted to the values $0 < \alpha \le 2$; $\delta$ ($-\infty < \delta < \infty$) is the location parameter; and $\gamma$ ($\gamma > 0$) is the dispersion of the distribution. For values of $\alpha$ in the interval $(1, 2]$, the location parameter $\delta$ corresponds to the mean of the S$\alpha$S

Figure 3.1: Standard S$\alpha$S densities for $\alpha = 0.5$, $1$, $1.5$, $2$.

distribution, while for $0 < \alpha \le 1$, $\delta$ corresponds to its median. The dispersion parameter $\gamma$ determines the spread of the distribution around its location parameter, similar to the variance of the Gaussian distribution. The characteristic exponent $\alpha$ is the most important parameter of the S$\alpha$S distribution: it determines the shape of the distribution. A stable distribution is called standard if $\delta = 0$ and $\gamma = 1$. Clearly, if a random variable $X$ is stable with parameters $\alpha$, $\delta$, $\gamma$, then $(X - \delta)/\gamma^{1/\alpha}$ is standard with characteristic exponent $\alpha$. The standard S$\alpha$S density functions for a few values of the characteristic exponent are shown in Figure 3.1. By letting $\alpha$ take the values 1 and 2, we get two important special cases of S$\alpha$S distributions, namely, the Cauchy ($\alpha = 1$) and the Gaussian ($\alpha = 2$):

Cauchy:   f_1(\gamma, x) = \frac{\gamma}{\pi[\gamma^2 + (x - \delta)^2]}   (3.2)

Gaussian:   f_2(\gamma, x) = \frac{1}{2\sqrt{\pi\gamma}} \exp\left[-\frac{(x - \delta)^2}{4\gamma}\right]   (3.3)
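As a quick numerical sanity check on (3.1)-(3.3), the sketch below (Python; the function names are ours, not from the cited references) evaluates the characteristic function and verifies that the two closed-form densities integrate to one:

```python
import cmath
import math


def sas_cf(w, alpha, delta=0.0, gamma=1.0):
    """S-alpha-S characteristic function of (3.1): exp(j*delta*w - gamma*|w|^alpha)."""
    if not 0.0 < alpha <= 2.0:
        raise ValueError("the characteristic exponent must satisfy 0 < alpha <= 2")
    return cmath.exp(1j * delta * w - gamma * abs(w) ** alpha)


def cauchy_pdf(x, gamma=1.0, delta=0.0):
    """S-alpha-S density for alpha = 1, eq. (3.2)."""
    return gamma / (math.pi * (gamma ** 2 + (x - delta) ** 2))


def gaussian_pdf(x, gamma=1.0, delta=0.0):
    """S-alpha-S density for alpha = 2, eq. (3.3); the variance is 2*gamma."""
    return math.exp(-((x - delta) ** 2) / (4.0 * gamma)) / (2.0 * math.sqrt(math.pi * gamma))


def trapz(f, a, b, n):
    """Composite trapezoidal rule, used only to check normalization."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        s += f(a + i * h)
    return s * h
```

For $\alpha = 2$ the characteristic function $\exp(-\gamma\omega^2)$ is that of a zero-mean Gaussian with variance $2\gamma$, which is why (3.3) carries $4\gamma$ in the exponent rather than $2\gamma$.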

Figure 3.2: A close-up view of the tails of the densities in Figure 3.1.

Unfortunately, no closed-form expressions exist for general S$\alpha$S distributions other than the Cauchy and the Gaussian. However, power series expansions can be derived for $f_\alpha(\gamma, x)$. In the following, we shall assume that all S$\alpha$S distributions are centered at the origin, i.e., the location parameter $\delta = 0$. This is equivalent to the zero-mean assumption for Gaussian distributions. Then, the standard ($\gamma = 1$) S$\alpha$S density function is given by [69]

f_\alpha(x) =
\begin{cases}
\frac{1}{\pi x} \sum_{k=1}^{\infty} (-1)^{k-1} \frac{\Gamma(\alpha k + 1)}{k!}\, x^{-\alpha k} \sin\!\left(\frac{k\alpha\pi}{2}\right) & \text{for } 0 < \alpha < 1 \\[4pt]
\frac{1}{\pi(x^2 + 1)} & \text{for } \alpha = 1 \\[4pt]
\frac{1}{\pi\alpha} \sum_{k=0}^{\infty} \frac{(-1)^k}{(2k)!}\, \Gamma\!\left(\frac{2k+1}{\alpha}\right) x^{2k} & \text{for } 1 < \alpha < 2 \\[4pt]
\frac{1}{2\sqrt{\pi}} \exp\!\left[-\frac{x^2}{4}\right] & \text{for } \alpha = 2
\end{cases}   (3.4)

Although the S$\alpha$S density behaves approximately like a Gaussian density near the origin, its tails decay at a lower rate than the Gaussian density tails. While the Gaussian density has exponential tails, the stable densities have algebraic tails (cf. Figure 3.2). The smaller the characteristic exponent $\alpha$ is, the heavier the tails of the S$\alpha$S density. This implies that random variables following S$\alpha$S distributions with small characteristic exponents are highly impulsive. It is this heavy-tail characteristic that makes the S$\alpha$S densities appropriate for modeling signals and noise or interference which are impulsive in

nature. Figure 3.3 depicts some representative time series of S$\alpha$S deviates for comparison purposes. Clearly, several outliers can be observed in the data series when the characteristic exponent takes values other than 2. As $\alpha$ decreases, both the occurrence rate and the strength of the outliers increase, resulting in very impulsive processes.

S$\alpha$S densities obey two important properties which further justify their role in data modeling:

- The stability property, which states that the random variables $X_1, \ldots, X_n$ are independent and symmetric stable with the same characteristic exponent $\alpha$ if and only if, for any constants $a_1, \ldots, a_n$, the linear combination $\sum_{i=1}^{n} a_i X_i$ is also S$\alpha$S.

- The generalized central limit theorem, which states that the family of stable distributions contains all limiting distributions of sums of i.i.d. random variables.

An important difference between the Gaussian and the other distributions of the S$\alpha$S family is that only moments of order less than $\alpha$ exist for the non-Gaussian S$\alpha$S family members. The fractional lower-order moments (FLOMs) of a S$\alpha$S random variable with zero location parameter and dispersion $\gamma$ are given by:

E|X|^p = C(p, \alpha)\, \gamma^{p/\alpha} \quad \text{for } 0 < p < \alpha   (3.5)

where

C(p, \alpha) = \frac{2^{p+1}\, \Gamma\!\left(\frac{p+1}{2}\right) \Gamma\!\left(-\frac{p}{\alpha}\right)}{\alpha \sqrt{\pi}\; \Gamma\!\left(-\frac{p}{2}\right)}   (3.6)

and $\Gamma(\cdot)$ is the gamma function.

3.2 Bivariate Isotropic Stable Distributions

Multivariate stable distributions, much like the univariate stable distributions, are characterized by the stability property and the generalized central limit theorem. However, they are much more difficult to describe because they form a nonparametric set [69]. An exception is the family of multidimensional isotropic stable distributions. Here, we concentrate on the two-dimensional (bivariate) case, which is appropriate for modeling signals and noise in the bearing estimation problem. The characteristic function of a bivariate isotropic $\alpha$-stable distribution has the form

\varphi(\omega_1, \omega_2) = \exp(j(\delta_1\omega_1 + \delta_2\omega_2) - \gamma|\omega|^\alpha)   (3.7)
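The moment formula (3.5)-(3.6) is straightforward to evaluate with a gamma-function routine. The sketch below (Python; the helper names are ours) computes $C(p, \alpha)$, together with the $n$-dimensional constant $C_n(p, \alpha)$ that appears below for isotropic stable vectors, of which (3.6) is the $n = 1$ special case:

```python
import math


def flom_const(p, alpha):
    """C(p, alpha) of eq. (3.6); valid for 0 < p < alpha <= 2."""
    if not 0.0 < p < alpha <= 2.0:
        raise ValueError("need 0 < p < alpha <= 2")
    return (2.0 ** (p + 1) * math.gamma((p + 1) / 2.0) * math.gamma(-p / alpha)
            / (alpha * math.sqrt(math.pi) * math.gamma(-p / 2.0)))


def flom_const_n(p, alpha, n):
    """C_n(p, alpha) for an isotropic alpha-stable vector in R^n; C_1 equals (3.6)."""
    if not 0.0 < p < alpha <= 2.0:
        raise ValueError("need 0 < p < alpha <= 2")
    return (2.0 ** (p + 1) * math.gamma((p + n) / 2.0) * math.gamma(-p / alpha)
            / (alpha * math.gamma(n / 2.0) * math.gamma(-p / 2.0)))


def flom(p, alpha, gamma=1.0):
    """Fractional lower-order moment E|X|^p = C(p, alpha) * gamma^(p/alpha), eq. (3.5)."""
    return flom_const(p, alpha) * gamma ** (p / alpha)
```

Two closed-form cross-checks: a standard Gaussian is S$\alpha$S with $\alpha = 2$, $\gamma = 1/2$, so $E|X| = \sqrt{2/\pi}$; and for a standard Cauchy, $E|X|^{1/2} = \sqrt{2}$.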

Figure 3.3: S$\alpha$S time series. (a): $\alpha = 2.0$ (Gaussian), (b): $\alpha = 1.95$, (c): $\alpha = 1.5$, (d): $\alpha = 1.0$ (Cauchy), (e): $\alpha = 0.85$, (f): $\alpha = 0.45$.
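The convergent series branch of (3.4) for $1 < \alpha < 2$ also sums to the Gaussian branch at the endpoint $\alpha = 2$, which gives a convenient numerical cross-check. A minimal sketch (Python, truncating the series; the function name is ours):

```python
import math


def sas_pdf_series(x, alpha, terms=40):
    """Standard S-alpha-S density via the power series of (3.4);
    this branch is valid for 1 < alpha <= 2 (at alpha = 2 it sums
    to the closed-form Gaussian branch)."""
    if not 1.0 < alpha <= 2.0:
        raise ValueError("this series branch requires 1 < alpha <= 2")
    total = 0.0
    for k in range(terms):
        total += ((-1) ** k / math.factorial(2 * k)
                  * math.gamma((2 * k + 1) / alpha) * x ** (2 * k))
    return total / (math.pi * alpha)
```

Truncation is only safe for moderate $|x|$; for large $|x|$ the alternating terms cancel catastrophically, and the asymptotic tail series should be used instead.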

where $\omega = (\omega_1, \omega_2)$ and $|\omega| = \sqrt{\omega_1^2 + \omega_2^2}$. Again here, the parameters $\alpha$ and $\gamma$ are the characteristic exponent and the dispersion, respectively. The parameters $\delta_1$, $\delta_2$ are the location parameters, and the distribution is isotropic with respect to the point $(\delta_1, \delta_2)$. Note that the two marginal distributions of the isotropic stable distribution are S$\alpha$S with parameters $(\alpha, \delta_1, \gamma)$ and $(\alpha, \delta_2, \gamma)$. In the following, we will assume that $(\delta_1, \delta_2) = (0, 0)$. The bivariate isotropic Cauchy and Gaussian distributions are special cases for $\alpha = 1$ and $\alpha = 2$, respectively. As in the case of the univariate S$\alpha$S density function, when $\alpha \ne 1$ and $\alpha \ne 2$, no closed-form expressions exist for the density function of the bivariate stable random variable. By using the polar coordinate $r = |x| = \sqrt{x_1^2 + x_2^2}$, the density function can be written as $f_\alpha(x_1, x_2) = \psi_\alpha(r)$, and can be expressed in a power series expansion form:

\psi_\alpha(r) =
\begin{cases}
\frac{1}{4\pi^2 \gamma^{2/\alpha}} \sum_{k=1}^{\infty} \frac{(-1)^{k-1}}{k!} \left[\Gamma\!\left(\frac{\alpha k}{2} + 1\right)\right]^2 \sin\!\left(\frac{k\alpha\pi}{2}\right) \left(\frac{r}{2\gamma^{1/\alpha}}\right)^{-\alpha k - 2} & \text{for } 0 < \alpha < 1 \\[4pt]
\frac{\gamma}{2\pi (r^2 + \gamma^2)^{3/2}} & \text{for } \alpha = 1 \\[4pt]
\frac{1}{2\pi\alpha\, \gamma^{2/\alpha}} \sum_{k=0}^{\infty} \frac{(-1)^k}{(k!)^2}\, \Gamma\!\left(\frac{2(k+1)}{\alpha}\right) \left(\frac{r}{2\gamma^{1/\alpha}}\right)^{2k} & \text{for } 1 < \alpha < 2 \\[4pt]
\frac{1}{4\pi\gamma} \exp\!\left[-\frac{r^2}{4\gamma}\right] & \text{for } \alpha = 2
\end{cases}   (3.8)

The density function $\psi_\alpha(r)$ described above is also a heavy-tailed function. An expression for the FLOMs, similar to the one for the single-dimensional case, can be found in [98]. If $X$ is a random vector in $R^n$ having the isotropic stable distribution with dispersion $\gamma$, then

E|X|^p = C_n(p, \alpha)\, \gamma^{p/\alpha} \quad \text{for } 0 < p < \alpha   (3.9)

where

C_n(p, \alpha) = \frac{2^{p+1}\, \Gamma\!\left(\frac{p+n}{2}\right) \Gamma\!\left(-\frac{p}{\alpha}\right)}{\alpha\, \Gamma\!\left(\frac{n}{2}\right) \Gamma\!\left(-\frac{p}{2}\right)}   (3.10)

3.3 Symmetric Alpha-Stable Processes

A collection of random variables $\{X(t),\ t \in T\}$, where $T$ is an arbitrary index set, is said to be a S$\alpha$S stochastic process if, for all combinations of distinct indices $t_1, \ldots, t_n \in T$, the random variables $X(t_1), \ldots, X(t_n)$ are jointly S$\alpha$S with the same characteristic exponent $\alpha$. The stochastic process $\{X(t),\ t \in T\}$ is stationary if the random vectors $(X(t_1), \ldots, X(t_n))$ and $(X(t_1+s), \ldots, X(t_n+s))$ are identically distributed for each choice of $s, t_1, \ldots, t_n \in T$. The family of stable processes has many members with mutually

exclusive properties. In the following, we present three important types of stable processes that are commonly used.

1) Sub-Gaussian Processes: A stable process $\{X(t),\ t \in T\}$ is said to be an $\alpha$-sub-Gaussian process ($\alpha$-SG($R$)) if, for distinct indices $t_1, \ldots, t_n \in T$, $(X(t_1), \ldots, X(t_n))$ has characteristic function given by

\varphi(\omega) = \exp\!\left(-\left[\frac{1}{2} \sum_{l,m=1}^{n} \omega_l \omega_m R(t_l, t_m)\right]^{\alpha/2}\right)   (3.11)

where $R(t, s)$ is a positive definite function, $\omega = [\omega_1, \ldots, \omega_n]^T$, and $\alpha$ takes values in $(1, 2]$. When $\alpha = 2$, $X(t)$ is a Gaussian process with zero mean and covariance function $R(t, s)$. A sub-Gaussian process is stationary if and only if $R(t, s) = R(t - s) = R(s - t)$. Sub-Gaussian processes share many common features with Gaussian processes. In fact, sub-Gaussian processes are variance mixtures of Gaussian processes [1]. Specifically, if $X(t)$ is $\alpha$-SG($R$), then

X(t) = S^{1/2}\, Y(t)   (3.12)

where $S$ is a positive $(\alpha/2)$-stable random variable, independent of $Y(t)$, which is a Gaussian process with zero mean and covariance function $R(t, s)$. An important distinction between Gaussian and sub-Gaussian processes is that, while linear spaces of Gaussian random variables may contain non-degenerate independent elements, sub-Gaussian random variables cannot be independent [1]. A sub-Gaussian process $X(t) = S^{1/2} Y(t)$ is stationary if and only if the Gaussian process $Y(t)$ is stationary.

2) Linear Stable Processes: Let $\{U(n),\ n = 0, \pm 1, \pm 2, \ldots\}$ be a family of i.i.d. S$\alpha$S random variables. Then,

X(n) = \sum_{i=-\infty}^{+\infty} a_i\, U(n - i)   (3.13)

defines a stationary S$\alpha$S random process if $\sum_{i=-\infty}^{+\infty} |a_i|^{\alpha - \epsilon} < \infty$ for some $0 < \epsilon < \alpha$ when $0 < \alpha < 1$, or if $\sum_{i=-\infty}^{+\infty} |a_i| < \infty$ when $1 \le \alpha \le 2$. These processes are called linear stable processes, or stable processes with a moving-average representation. Examples of linear stable processes include finite-order autoregressive (AR), moving-average (MA) and autoregressive moving-average (ARMA) processes [69].
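As an illustration of (3.13), the sketch below (Python; naming is ours) simulates a finite moving average. We drive the filter with Cauchy innovations because the $\alpha = 1$ deviate has the simple closed form $\tan(\pi(U - 1/2))$ for uniform $U$:

```python
import math
import random


def cauchy_deviate(rng):
    """Standard S-alpha-S deviate for alpha = 1: the tangent of a uniform angle."""
    return math.tan(math.pi * (rng.random() - 0.5))


def stable_ma(coeffs, n, rng):
    """Length-n sample path of the finite moving average of (3.13),
    X(t) = sum_i coeffs[i] * U(t - i), with i.i.d. Cauchy innovations U.
    A finite coefficient list trivially satisfies the summability condition."""
    q = len(coeffs)
    u = [cauchy_deviate(rng) for _ in range(n + q - 1)]
    return [sum(a * u[t + i] for i, a in enumerate(coeffs)) for t in range(n)]
```

By the stability property, each $X(t)$ is again S$\alpha$S with $\alpha = 1$, and a plotted path shows the large outliers characteristic of Figure 3.3(d).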

3) Harmonizable Stable Processes: A complex-valued S$\alpha$S process $X(t)$ is called harmonizable if it can be expressed as:

X(t) = \int_{-\infty}^{\infty} e^{jt\omega}\, d\xi(\omega), \quad -\infty < t < \infty   (3.14)

where $\xi(\omega)$ is a S$\alpha$S process with independent increments satisfying

\{E|d\xi(\omega)|^p\}^{\alpha/p} = C(p, \alpha)\, \Phi(\omega)\, d\omega \quad \text{for all } 0 < p < \alpha   (3.15)

where $C(p, \alpha)$ is a constant depending on $p$ and $\alpha$, and $\Phi(\omega)$ is a nonnegative function called the spectral density of $X(t)$ [7]. The spectral density $\Phi(\omega)$ fully describes the distribution of the process $X(t)$. In another sharp contrast with the Gaussian case, the class of S$\alpha$S harmonizable processes is disjoint from the class of linear processes. In addition, sub-Gaussian processes are neither linear nor harmonizable [1].

In modeling the signals and/or noise for the parameter estimation problem, we need a complex model for the noise samples. We also need to define quantities which describe correlations between random variables. In the following section, we present the family of isotropic complex S$\alpha$S distributions and describe their correlation properties by means of the covariation quantity.

3.4 Complex S$\alpha$S Random Variables and Covariations

A complex random variable (r.v.) $X = X_1 + jX_2$ is symmetric $\alpha$-stable (S$\alpha$S) if $X_1$ and $X_2$ are jointly S$\alpha$S, and its characteristic function is written as

\varphi(\omega) = E \exp[j\,\Re(\omega X^*)] = E \exp[j(\omega_1 X_1 + \omega_2 X_2)] = \exp\!\left(-\int_{S_2} |\omega_1 x_1 + \omega_2 x_2|^\alpha\, d\Gamma_{X_1,X_2}(x_1, x_2)\right)   (3.16)

where $\omega = \omega_1 + j\omega_2$, $\Re[\cdot]$ is the real part operator, and $\Gamma_{X_1,X_2}$ is a symmetric measure on the unit sphere $S_2$, called the spectral measure of the random variable $X$. A complex random variable $X = X_1 + jX_2$ is isotropic if and only if $(X_1, X_2)$ has a uniform spectral measure. In this case, the characteristic function of $X$ can be written as

\varphi(\omega) = E \exp(j\,\Re[\omega X^*]) = \exp(-\gamma|\omega|^\alpha)   (3.17)
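The step from (3.16) to (3.17) can be checked numerically: with a uniform spectral measure, the exponent integral depends on $\omega$ only through $|\omega|^\alpha$. A small sketch (Python, midpoint quadrature; names are ours):

```python
import math


def cf_exponent(w1, w2, alpha, mass=1.0, n=100000):
    """Exponent of (3.16) for a uniform spectral measure of total mass `mass`:
    (mass / 2*pi) * integral over the unit circle of |w1*cos t + w2*sin t|^alpha dt."""
    h = 2.0 * math.pi / n
    total = 0.0
    for i in range(n):
        t = (i + 0.5) * h  # midpoint rule on (0, 2*pi)
        total += abs(w1 * math.cos(t) + w2 * math.sin(t)) ** alpha
    return (mass / (2.0 * math.pi)) * total * h
```

Isotropy shows up as invariance under rotation of $(\omega_1, \omega_2)$, and the integral scales as $|\omega|^\alpha$, so it equals $\gamma|\omega|^\alpha$ with $\gamma$ fixed by the total mass of the measure.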

where $\gamma$ ($\gamma > 0$) is the dispersion of the distribution.

In the theory of second-order processes, the concept of covariance plays an important role in problems of linear prediction, filtering and smoothing. Since S$\alpha$S processes do not possess finite $p$th-order moments for $p \ge \alpha$, covariances do not exist on the space of S$\alpha$S random variables. Instead, a quantity called covariation plays a role for statistical signal processing problems involving S$\alpha$S processes analogous to the one played by covariance in the case of second-order processes. Several complex r.v.'s are jointly S$\alpha$S if their real and imaginary parts are jointly S$\alpha$S. When $X = X_1 + jX_2$ and $Y = Y_1 + jY_2$ are jointly S$\alpha$S with $1 < \alpha \le 2$, the covariation of $X$ and $Y$ is defined by

[X, Y]_\alpha = \int_{S_4} (x_1 + jx_2)(y_1 + jy_2)^{\langle \alpha - 1 \rangle}\, d\Gamma_{X_1,X_2,Y_1,Y_2}(x_1, x_2, y_1, y_2)   (3.18)

where we use throughout the convention

Y^{\langle \beta \rangle} = |Y|^{\beta - 1}\, Y^*   (3.19)

It can be shown that for every $1 \le p < \alpha$, the covariation can be expressed as a function of moments [9]:

[X, Y]_\alpha = \frac{E[X\, Y^{\langle p-1 \rangle}]}{E|Y|^p}\, \gamma_Y   (3.20)

where $\gamma_Y$ is the dispersion of the r.v. $Y$, given by

\gamma_Y = \left(\frac{E|Y|^p}{C(p, \alpha)}\right)^{\alpha/p} \quad \text{for } 0 < p < \alpha   (3.21)

with

C(p, \alpha) = \frac{2^{p+1}\, \Gamma\!\left(\frac{p+2}{2}\right) \Gamma\!\left(-\frac{p}{\alpha}\right)}{\alpha\, \Gamma(1)\, \Gamma\!\left(-\frac{p}{2}\right)}   (3.22)

Obviously, from (3.20) it holds that

[X, X]_\alpha = \gamma_X   (3.23)

Also, the covariation coefficient of $X$ and $Y$ is defined by

\lambda_{X,Y} = \frac{[X, Y]_\alpha}{[Y, Y]_\alpha}   (3.24)

and by using (3.20), it can be expressed as

\lambda_{X,Y} = \frac{E[X\, Y^{\langle p-1 \rangle}]}{E|Y|^p} \quad \text{for } 1 \le p < \alpha   (3.25)

The covariation of complex jointly S$\alpha$S r.v.'s is not symmetric in general and has the following properties [6]:

P1 If $X_1$, $X_2$ and $Y$ are jointly S$\alpha$S, then

[aX_1 + bX_2, Y]_\alpha = a[X_1, Y]_\alpha + b[X_2, Y]_\alpha   (3.26)

for any complex constants $a$ and $b$.

P2 If $Y_1$ and $Y_2$ are independent and $X_1$, $Y_1$, $Y_2$ are jointly S$\alpha$S, then

[aX_1, bY_1 + cY_2]_\alpha = a\, b^{\langle \alpha-1 \rangle}[X_1, Y_1]_\alpha + a\, c^{\langle \alpha-1 \rangle}[X_1, Y_2]_\alpha   (3.27)

for any complex constants $a$, $b$ and $c$.

P3 If $X$ and $Y$ are independent S$\alpha$S, then $[X, Y]_\alpha = 0$.

P4 Let $\{U_i,\ i = 1, \ldots, n\}$ be independent complex S$\alpha$S r.v.'s with dispersions $\gamma_i$. For any complex numbers $\{a_i, b_i,\ i = 1, \ldots, n\}$, form

X = a_1 U_1 + \cdots + a_n U_n, \quad Y = b_1 U_1 + \cdots + b_n U_n.

Then

[X, X]_\alpha = \gamma_1 |a_1|^\alpha + \cdots + \gamma_n |a_n|^\alpha
[Y, Y]_\alpha = \gamma_1 |b_1|^\alpha + \cdots + \gamma_n |b_n|^\alpha
[X, Y]_\alpha = \gamma_1 a_1 b_1^{\langle \alpha-1 \rangle} + \cdots + \gamma_n a_n b_n^{\langle \alpha-1 \rangle}   (3.28)
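Properties P1-P4 make covariations directly computable for linear combinations of independent S$\alpha$S variables, and (3.25) suggests a simple moment (FLOM) estimator. A sketch of both (Python; the helper names are ours, and the estimator is shown for the real-valued case):

```python
import random


def cspow(y, b):
    """Signed power y^<b> = |y|**(b-1) * conj(y) of (3.19); real or complex y."""
    return 0.0 if y == 0 else abs(y) ** (b - 1.0) * y.conjugate()


def covariation_p4(a, b, gammas, alpha):
    """[X, Y]_alpha for X = sum_i a[i]*U_i, Y = sum_i b[i]*U_i, with independent
    complex S-alpha-S U_i of dispersions gammas[i] (property P4, eq. (3.28))."""
    return sum(g * ai * cspow(bi, alpha - 1.0) for g, ai, bi in zip(gammas, a, b))


def covariation_coeff_est(x, y, p=1.0):
    """FLOM estimator of the covariation coefficient, eq. (3.25), from samples
    (for real y and p = 1, y^<p-1> reduces to the sign of y)."""
    num = sum(xt * cspow(yt, p - 1.0) for xt, yt in zip(x, y))
    den = sum(abs(yt) ** p for yt in y)
    return num / den
```

With $Y = X = a_1 U_1 + a_2 U_2$, property P4 reduces to $[X, X]_\alpha = \sum_i \gamma_i |a_i|^\alpha$, in agreement with (3.23) and the stability property.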

3.5 Generation of Complex Isotropic S$\alpha$S Random Variables

The generation of complex isotropic S$\alpha$S deviates of characteristic exponent $\alpha$ is based on the following proposition found in [64]:

Proposition 3.1 A complex S$\alpha$S ($0 < \alpha < 2$) random variable $X = X_1 + jX_2$ is isotropic if and only if there exist two i.i.d. zero-mean Gaussian random variables $G_1$ and $G_2$ and a real stable random variable $A$ of characteristic exponent $\alpha/2$, dispersion $\cos(\pi\alpha/4)$ and skewness $\beta = 1$ (we write $A \sim S_{\alpha/2}(\cos(\pi\alpha/4), 1)$), independent of $(G_1, G_2)$, such that $(X_1, X_2) \overset{d}{=} (A^{1/2} G_1, A^{1/2} G_2)$. We say that the vector $(X_1, X_2)$ is sub-Gaussian with underlying vector $(G_1, G_2)$.

It can be shown that the real and imaginary parts of $X$ are always dependent, unless $G_1$ and $G_2$ are degenerate. Hence, every complex isotropic S$\alpha$S random variable with $\alpha < 2$ can be expressed as

X = A^{1/2}(G_1 + jG_2)   (3.29)

and its generation involves the generation of a real, totally skewed stable random variable. The problem of generating a real stable deviate is studied in [14] and [3]. Here, we present the result for easy reference. To generate a real standard stable random variable $A \sim S_\alpha(1, \beta)$ of characteristic exponent $\alpha$, skewness $\beta$ and unit dispersion $\gamma = 1$, the following representations can be deduced:

S_\alpha(1, \beta) = D_{\alpha,\beta}\, \frac{\sin(\alpha(U - U_0))}{(\cos U)^{1/\alpha}} \left[\frac{\cos(U - \alpha(U - U_0))}{W}\right]^{(1-\alpha)/\alpha} \quad \text{for } \alpha \ne 1   (3.30)

and

S_1(1, \beta) = \frac{2}{\pi}\left[\left(\frac{\pi}{2} + \beta U\right)\tan U - \beta \ln\!\left(\frac{\frac{\pi}{2} W \cos U}{\frac{\pi}{2} + \beta U}\right)\right]   (3.31)

where $W$ is standard exponential with $\Pr\{W > w\} = e^{-w}$, $w > 0$, and $U$ is uniform on $(-\pi/2, \pi/2)$. Also, $D_{\alpha,\beta} = [\cos(\arctan(\beta \tan(\pi\alpha/2)))]^{-1/\alpha}$, and $U_0 = -\frac{\pi}{2}\,\beta\,\frac{k(\alpha)}{\alpha}$ with $k(\alpha) = 1 - |1 - \alpha|$. Then, a stable variate $A_1$ of dispersion $\gamma$ can be obtained from $A$ by $A_1 = \gamma^{1/\alpha} A$.
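For the symmetric case $\beta = 0$, $U_0 = 0$ and $D_{\alpha,0} = 1$, so (3.30)-(3.31) collapse to the short generator below (Python sketch; naming is ours):

```python
import math
import random


def sas_deviate(alpha, rng):
    """Standard (gamma = 1) S-alpha-S deviate via the beta = 0 case of (3.30)-(3.31)."""
    if not 0.0 < alpha <= 2.0:
        raise ValueError("need 0 < alpha <= 2")
    u = math.pi * (rng.random() - 0.5)            # U ~ uniform(-pi/2, pi/2)
    if alpha == 1.0:
        return math.tan(u)                         # the Cauchy case of (3.31)
    w = -math.log(1.0 - rng.random())              # W ~ standard exponential
    return (math.sin(alpha * u) / math.cos(u) ** (1.0 / alpha)
            * (math.cos((1.0 - alpha) * u) / w) ** ((1.0 - alpha) / alpha))
```

At $\alpha = 2$ the expression reduces to $2 \sin(U)\sqrt{W}$, a zero-mean Gaussian with variance 2, consistent with $\gamma = 1$ in (3.1); a dispersion-$\gamma$ deviate is $\gamma^{1/\alpha}$ times a standard one.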

To conclude, the following proposition gives the relationship between the dispersion $\gamma$ of the complex r.v. $X = X_1 + jX_2$ and the variance of the underlying complex Gaussian r.v. $G = G_1 + jG_2$.

Proposition 3.2 The dispersion of the complex r.v. $X = X_1 + jX_2$ generated according to Proposition 3.1 is given by $\gamma = (\sigma^2/4)^{\alpha/2} = (\sigma/2)^\alpha$, where $\sigma^2$ is the variance of the underlying complex Gaussian random variable.

Proof The Laplace transform of the r.v. $A \sim S_{\alpha/2}(\cos(\pi\alpha/4), 1)$ is given by [64]:

E\{\exp(-sA)\} = \exp(-s^{\alpha/2}), \quad s > 0   (3.32)

Also, since $G = G_1 + jG_2 \sim N_C(0, \sigma^2)$, its characteristic function is given by

\varphi_G(\omega) = \exp(-\sigma^2 |\omega|^2 / 4)   (3.33)

where $\omega = \omega_1 + j\omega_2$. Then, the characteristic function of $X$ can be expressed as

\varphi_X(\omega) = E\{\exp(j\,\Re[\omega X^*])\}
              = E\{\exp(j(\omega_1 A^{1/2} G_1 + \omega_2 A^{1/2} G_2))\}
              = E\{E\{\exp(j(\omega_1 A^{1/2} G_1 + \omega_2 A^{1/2} G_2)) \mid A\}\}
              = E\{\exp(-\sigma^2(\omega_1^2 + \omega_2^2) A / 4)\}  \quad \text{[by use of (3.33)]}
              = \exp(-(\sigma^2/4)^{\alpha/2}\, |\omega|^\alpha)  \quad \text{[by use of (3.32)]}   (3.34)

Comparing (3.34) with (3.17), we conclude that $\gamma = (\sigma^2/4)^{\alpha/2} = (\sigma/2)^\alpha$.
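Putting Propositions 3.1 and 3.2 together with the $\beta = 1$ case of (3.30) gives a complete generator for complex isotropic S$\alpha$S deviates. For exponent $\rho = \alpha/2 < 1$ and $\beta = 1$, substituting $\theta = U + \pi/2$ turns (3.30) into Kanter's form for a positive stable variable, and the dispersion $\cos(\pi\alpha/4)$ cancels against $D_{\alpha/2,1}$; the sketch below (Python, our naming) relies on that simplification:

```python
import math
import random


def positive_stable(rho, rng):
    """Totally skewed positive rho-stable deviate (0 < rho < 1) with Laplace
    transform E[exp(-s*A)] = exp(-s**rho), as in (3.32); Kanter's form of the
    beta = 1 case of (3.30)."""
    theta = math.pi * rng.random()                 # uniform on (0, pi)
    w = -math.log(1.0 - rng.random())              # standard exponential
    return (math.sin(rho * theta)
            * math.sin((1.0 - rho) * theta) ** ((1.0 - rho) / rho)
            / math.sin(theta) ** (1.0 / rho)
            * w ** (-(1.0 - rho) / rho))


def complex_isotropic_sas(alpha, gamma, rng):
    """Complex isotropic S-alpha-S deviate X = A**(1/2) * (G1 + j*G2) per (3.29).
    By Proposition 3.2, gamma = (sigma/2)**alpha, so sigma = 2 * gamma**(1/alpha)."""
    if not 0.0 < alpha < 2.0:
        raise ValueError("Proposition 3.1 covers 0 < alpha < 2")
    sigma = 2.0 * gamma ** (1.0 / alpha)           # std. dev. of the complex Gaussian
    a = positive_stable(alpha / 2.0, rng)
    g1 = rng.gauss(0.0, sigma / math.sqrt(2.0))    # each part carries variance sigma^2/2
    g2 = rng.gauss(0.0, sigma / math.sqrt(2.0))
    return math.sqrt(a) * complex(g1, g2)
```

The Laplace transform (3.32) itself provides the natural empirical check on the skewed generator: for $\rho = 1/2$, the sample mean of $e^{-A}$ should be close to $e^{-1}$.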


More information

IEOR 4701: Stochastic Models in Financial Engineering. Summer 2007, Professor Whitt. SOLUTIONS to Homework Assignment 9: Brownian motion

IEOR 4701: Stochastic Models in Financial Engineering. Summer 2007, Professor Whitt. SOLUTIONS to Homework Assignment 9: Brownian motion IEOR 471: Stochastic Models in Financial Engineering Summer 27, Professor Whitt SOLUTIONS to Homework Assignment 9: Brownian motion In Ross, read Sections 1.1-1.3 and 1.6. (The total required reading there

More information

STAT 302 Introduction to Probability Learning Outcomes. Textbook: A First Course in Probability by Sheldon Ross, 8 th ed.

STAT 302 Introduction to Probability Learning Outcomes. Textbook: A First Course in Probability by Sheldon Ross, 8 th ed. STAT 302 Introduction to Probability Learning Outcomes Textbook: A First Course in Probability by Sheldon Ross, 8 th ed. Chapter 1: Combinatorial Analysis Demonstrate the ability to solve combinatorial

More information

Basic concepts of probability theory

Basic concepts of probability theory Basic concepts of probability theory Random variable discrete/continuous random variable Transform Z transform, Laplace transform Distribution Geometric, mixed-geometric, Binomial, Poisson, exponential,

More information

Lecture 2: Review of Probability

Lecture 2: Review of Probability Lecture 2: Review of Probability Zheng Tian Contents 1 Random Variables and Probability Distributions 2 1.1 Defining probabilities and random variables..................... 2 1.2 Probability distributions................................

More information

Part IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015

Part IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015 Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.

More information

Fourier Analysis Linear transformations and lters. 3. Fourier Analysis. Alex Sheremet. April 11, 2007

Fourier Analysis Linear transformations and lters. 3. Fourier Analysis. Alex Sheremet. April 11, 2007 Stochastic processes review 3. Data Analysis Techniques in Oceanography OCP668 April, 27 Stochastic processes review Denition Fixed ζ = ζ : Function X (t) = X (t, ζ). Fixed t = t: Random Variable X (ζ)

More information

Multiplicative Multifractal Modeling of. Long-Range-Dependent (LRD) Trac in. Computer Communications Networks. Jianbo Gao and Izhak Rubin

Multiplicative Multifractal Modeling of. Long-Range-Dependent (LRD) Trac in. Computer Communications Networks. Jianbo Gao and Izhak Rubin Multiplicative Multifractal Modeling of Long-Range-Dependent (LRD) Trac in Computer Communications Networks Jianbo Gao and Izhak Rubin Electrical Engineering Department, University of California, Los Angeles

More information

Multiple Random Variables

Multiple Random Variables Multiple Random Variables Joint Probability Density Let X and Y be two random variables. Their joint distribution function is F ( XY x, y) P X x Y y. F XY ( ) 1, < x

More information

Confidence Intervals, Testing and ANOVA Summary

Confidence Intervals, Testing and ANOVA Summary Confidence Intervals, Testing and ANOVA Summary 1 One Sample Tests 1.1 One Sample z test: Mean (σ known) Let X 1,, X n a r.s. from N(µ, σ) or n > 30. Let The test statistic is H 0 : µ = µ 0. z = x µ 0

More information

PREDICTION AND NONGAUSSIAN AUTOREGRESSIVE STATIONARY SEQUENCES 1. Murray Rosenblatt University of California, San Diego

PREDICTION AND NONGAUSSIAN AUTOREGRESSIVE STATIONARY SEQUENCES 1. Murray Rosenblatt University of California, San Diego PREDICTION AND NONGAUSSIAN AUTOREGRESSIVE STATIONARY SEQUENCES 1 Murray Rosenblatt University of California, San Diego Abstract The object of this paper is to show that under certain auxiliary assumptions

More information

Delta Method. Example : Method of Moments for Exponential Distribution. f(x; λ) = λe λx I(x > 0)

Delta Method. Example : Method of Moments for Exponential Distribution. f(x; λ) = λe λx I(x > 0) Delta Method Often estimators are functions of other random variables, for example in the method of moments. These functions of random variables can sometimes inherit a normal approximation from the underlying

More information

User Guide for Hermir version 0.9: Toolbox for Synthesis of Multivariate Stationary Gaussian and non-gaussian Series

User Guide for Hermir version 0.9: Toolbox for Synthesis of Multivariate Stationary Gaussian and non-gaussian Series User Guide for Hermir version 0.9: Toolbox for Synthesis of Multivariate Stationary Gaussian and non-gaussian Series Hannes Helgason, Vladas Pipiras, and Patrice Abry June 2, 2011 Contents 1 Organization

More information

SYSM 6303: Quantitative Introduction to Risk and Uncertainty in Business Lecture 4: Fitting Data to Distributions

SYSM 6303: Quantitative Introduction to Risk and Uncertainty in Business Lecture 4: Fitting Data to Distributions SYSM 6303: Quantitative Introduction to Risk and Uncertainty in Business Lecture 4: Fitting Data to Distributions M. Vidyasagar Cecil & Ida Green Chair The University of Texas at Dallas Email: M.Vidyasagar@utdallas.edu

More information

The autocorrelation and autocovariance functions - helpful tools in the modelling problem

The autocorrelation and autocovariance functions - helpful tools in the modelling problem The autocorrelation and autocovariance functions - helpful tools in the modelling problem J. Nowicka-Zagrajek A. Wy lomańska Institute of Mathematics and Computer Science Wroc law University of Technology,

More information

Asymptotic Tail Probabilities of Sums of Dependent Subexponential Random Variables

Asymptotic Tail Probabilities of Sums of Dependent Subexponential Random Variables Asymptotic Tail Probabilities of Sums of Dependent Subexponential Random Variables Jaap Geluk 1 and Qihe Tang 2 1 Department of Mathematics The Petroleum Institute P.O. Box 2533, Abu Dhabi, United Arab

More information

Extremogram and ex-periodogram for heavy-tailed time series

Extremogram and ex-periodogram for heavy-tailed time series Extremogram and ex-periodogram for heavy-tailed time series 1 Thomas Mikosch University of Copenhagen Joint work with Richard A. Davis (Columbia) and Yuwei Zhao (Ulm) 1 Zagreb, June 6, 2014 1 2 Extremal

More information

Gaussian Processes. Le Song. Machine Learning II: Advanced Topics CSE 8803ML, Spring 2012

Gaussian Processes. Le Song. Machine Learning II: Advanced Topics CSE 8803ML, Spring 2012 Gaussian Processes Le Song Machine Learning II: Advanced Topics CSE 8803ML, Spring 01 Pictorial view of embedding distribution Transform the entire distribution to expected features Feature space Feature

More information

Maximum Likelihood Estimation. only training data is available to design a classifier

Maximum Likelihood Estimation. only training data is available to design a classifier Introduction to Pattern Recognition [ Part 5 ] Mahdi Vasighi Introduction Bayesian Decision Theory shows that we could design an optimal classifier if we knew: P( i ) : priors p(x i ) : class-conditional

More information

Part IV Stochastic Image Analysis 1 Contents IV Stochastic Image Analysis 1 7 Introduction to Stochastic Processes 4 7.1 Probability................................... 5 7.1.1 Random events and subjective

More information

Chapter 5 continued. Chapter 5 sections

Chapter 5 continued. Chapter 5 sections Chapter 5 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions

More information

1.1 Review of Probability Theory

1.1 Review of Probability Theory 1.1 Review of Probability Theory Angela Peace Biomathemtics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology. CRC Press,

More information

CS Lecture 19. Exponential Families & Expectation Propagation

CS Lecture 19. Exponential Families & Expectation Propagation CS 6347 Lecture 19 Exponential Families & Expectation Propagation Discrete State Spaces We have been focusing on the case of MRFs over discrete state spaces Probability distributions over discrete spaces

More information

Notes on Mathematical Expectations and Classes of Distributions Introduction to Econometric Theory Econ. 770

Notes on Mathematical Expectations and Classes of Distributions Introduction to Econometric Theory Econ. 770 Notes on Mathematical Expectations and Classes of Distributions Introduction to Econometric Theory Econ. 77 Jonathan B. Hill Dept. of Economics University of North Carolina - Chapel Hill October 4, 2 MATHEMATICAL

More information

PCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities

PCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets

More information

ECE 673-Random signal analysis I Final

ECE 673-Random signal analysis I Final ECE 673-Random signal analysis I Final Q ( point) Comment on the following statement "If cov(x ; X 2 ) 0; then the best predictor (without limitation on the type of predictor) of the random variable X

More information

The Bias-Variance dilemma of the Monte Carlo. method. Technion - Israel Institute of Technology, Technion City, Haifa 32000, Israel

The Bias-Variance dilemma of the Monte Carlo. method. Technion - Israel Institute of Technology, Technion City, Haifa 32000, Israel The Bias-Variance dilemma of the Monte Carlo method Zlochin Mark 1 and Yoram Baram 1 Technion - Israel Institute of Technology, Technion City, Haifa 32000, Israel fzmark,baramg@cs.technion.ac.il Abstract.

More information

Marshall-Olkin Bivariate Exponential Distribution: Generalisations and Applications

Marshall-Olkin Bivariate Exponential Distribution: Generalisations and Applications CHAPTER 6 Marshall-Olkin Bivariate Exponential Distribution: Generalisations and Applications 6.1 Introduction Exponential distributions have been introduced as a simple model for statistical analysis

More information

Prediction of Stable Stochastic Processes

Prediction of Stable Stochastic Processes Prediction of Stable Stochastic Processes Evgeny Spodarev Institute of Stochastics Workshop Probability, Analysis and Geometry, 2-6.09.2013 Seite 2 Prediction of Stable Stochastic Processes Evgeny Spodarev

More information

The Identification of ARIMA Models

The Identification of ARIMA Models APPENDIX 4 The Identification of ARIMA Models As we have established in a previous lecture, there is a one-to-one correspondence between the parameters of an ARMA(p, q) model, including the variance of

More information

EEG- Signal Processing

EEG- Signal Processing Fatemeh Hadaeghi EEG- Signal Processing Lecture Notes for BSP, Chapter 5 Master Program Data Engineering 1 5 Introduction The complex patterns of neural activity, both in presence and absence of external

More information

1 Review of di erential calculus

1 Review of di erential calculus Review of di erential calculus This chapter presents the main elements of di erential calculus needed in probability theory. Often, students taking a course on probability theory have problems with concepts

More information

Weak Limits for Multivariate Random Sums

Weak Limits for Multivariate Random Sums Journal of Multivariate Analysis 67, 398413 (1998) Article No. MV981768 Weak Limits for Multivariate Random Sums Tomasz J. Kozubowski The University of Tennessee at Chattanooga and Anna K. Panorska - The

More information

JUST THE MATHS UNIT NUMBER DIFFERENTIATION APPLICATIONS 5 (Maclaurin s and Taylor s series) A.J.Hobson

JUST THE MATHS UNIT NUMBER DIFFERENTIATION APPLICATIONS 5 (Maclaurin s and Taylor s series) A.J.Hobson JUST THE MATHS UNIT NUMBER.5 DIFFERENTIATION APPLICATIONS 5 (Maclaurin s and Taylor s series) by A.J.Hobson.5. Maclaurin s series.5. Standard series.5.3 Taylor s series.5.4 Exercises.5.5 Answers to exercises

More information

Applied Probability and Stochastic Processes

Applied Probability and Stochastic Processes Applied Probability and Stochastic Processes In Engineering and Physical Sciences MICHEL K. OCHI University of Florida A Wiley-Interscience Publication JOHN WILEY & SONS New York - Chichester Brisbane

More information

Probability Background

Probability Background Probability Background Namrata Vaswani, Iowa State University August 24, 2015 Probability recap 1: EE 322 notes Quick test of concepts: Given random variables X 1, X 2,... X n. Compute the PDF of the second

More information

1 Probability and Random Variables

1 Probability and Random Variables 1 Probability and Random Variables The models that you have seen thus far are deterministic models. For any time t, there is a unique solution X(t). On the other hand, stochastic models will result in

More information

conditional cdf, conditional pdf, total probability theorem?

conditional cdf, conditional pdf, total probability theorem? 6 Multiple Random Variables 6.0 INTRODUCTION scalar vs. random variable cdf, pdf transformation of a random variable conditional cdf, conditional pdf, total probability theorem expectation of a random

More information

Random Variables and Their Distributions

Random Variables and Their Distributions Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital

More information

Regression and Statistical Inference

Regression and Statistical Inference Regression and Statistical Inference Walid Mnif wmnif@uwo.ca Department of Applied Mathematics The University of Western Ontario, London, Canada 1 Elements of Probability 2 Elements of Probability CDF&PDF

More information

Neighbourhoods of Randomness and Independence

Neighbourhoods of Randomness and Independence Neighbourhoods of Randomness and Independence C.T.J. Dodson School of Mathematics, Manchester University Augment information geometric measures in spaces of distributions, via explicit geometric representations

More information

Introduction to Probability and Statistics (Continued)

Introduction to Probability and Statistics (Continued) Introduction to Probability and Statistics (Continued) Prof. icholas Zabaras Center for Informatics and Computational Science https://cics.nd.edu/ University of otre Dame otre Dame, Indiana, USA Email:

More information

Stat 5101 Notes: Brand Name Distributions

Stat 5101 Notes: Brand Name Distributions Stat 5101 Notes: Brand Name Distributions Charles J. Geyer September 5, 2012 Contents 1 Discrete Uniform Distribution 2 2 General Discrete Uniform Distribution 2 3 Uniform Distribution 3 4 General Uniform

More information

Bayesian Inference for the Multivariate Normal

Bayesian Inference for the Multivariate Normal Bayesian Inference for the Multivariate Normal Will Penny Wellcome Trust Centre for Neuroimaging, University College, London WC1N 3BG, UK. November 28, 2014 Abstract Bayesian inference for the multivariate

More information

Probability Lecture III (August, 2006)

Probability Lecture III (August, 2006) robability Lecture III (August, 2006) 1 Some roperties of Random Vectors and Matrices We generalize univariate notions in this section. Definition 1 Let U = U ij k l, a matrix of random variables. Suppose

More information

Monte Carlo Methods for Statistical Inference: Variance Reduction Techniques

Monte Carlo Methods for Statistical Inference: Variance Reduction Techniques Monte Carlo Methods for Statistical Inference: Variance Reduction Techniques Hung Chen hchen@math.ntu.edu.tw Department of Mathematics National Taiwan University 3rd March 2004 Meet at NS 104 On Wednesday

More information

If we want to analyze experimental or simulated data we might encounter the following tasks:

If we want to analyze experimental or simulated data we might encounter the following tasks: Chapter 1 Introduction If we want to analyze experimental or simulated data we might encounter the following tasks: Characterization of the source of the signal and diagnosis Studying dependencies Prediction

More information

290 J.M. Carnicer, J.M. Pe~na basis (u 1 ; : : : ; u n ) consisting of minimally supported elements, yet also has a basis (v 1 ; : : : ; v n ) which f

290 J.M. Carnicer, J.M. Pe~na basis (u 1 ; : : : ; u n ) consisting of minimally supported elements, yet also has a basis (v 1 ; : : : ; v n ) which f Numer. Math. 67: 289{301 (1994) Numerische Mathematik c Springer-Verlag 1994 Electronic Edition Least supported bases and local linear independence J.M. Carnicer, J.M. Pe~na? Departamento de Matematica

More information

Statistics of stochastic processes

Statistics of stochastic processes Introduction Statistics of stochastic processes Generally statistics is performed on observations y 1,..., y n assumed to be realizations of independent random variables Y 1,..., Y n. 14 settembre 2014

More information

Elec4621 Advanced Digital Signal Processing Chapter 11: Time-Frequency Analysis

Elec4621 Advanced Digital Signal Processing Chapter 11: Time-Frequency Analysis Elec461 Advanced Digital Signal Processing Chapter 11: Time-Frequency Analysis Dr. D. S. Taubman May 3, 011 In this last chapter of your notes, we are interested in the problem of nding the instantaneous

More information