A review of methods to calculate extreme wind speeds


Meteorol. Appl. 6, (1999)

J P Palutikof 1, B B Brabson 2, D H Lister 1 and S T Adcock 3
1 Climatic Research Unit, University of East Anglia, Norwich NR4 7TJ, UK
2 Department of Physics, Indiana University, Bloomington, Indiana 47405, USA
3 Department of Meteorology, University of Reading, Reading RG6 6BB, UK

Methods to calculate extreme wind speeds are described and reviewed, including classical methods based on the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD), and approaches designed specifically to deal with short data sets. The emphasis is very much on the needs of users who seek an accurate method to derive extreme wind speeds but are not fully conversant with up-to-date developments in this complex subject area. First, standard methods are reviewed: annual maxima, independent storms, r-largest extremes with the GEV distribution, and peak-over-threshold extremes with the GPD. Techniques for calculating the distribution parameters and quantiles are described. There follows a discussion of the factors which must be considered in order to fulfil the criterion that the data should be independent and identically distributed, and in order to minimize standard errors. It is commonplace in studies of extreme wind speeds that the time series available for analysis are very short. Finally, therefore, the paper deals with techniques applicable to data sets as short as two years, including simulation modelling and methods based on the parameters of the parent distribution.

1. Background

High wind speeds pose a threat to the integrity of structures, particularly those at exposed sites such as bridges, wind turbines and radio masts. In any design project for large structures, safety considerations must be balanced against the additional cost of over-design. Accurate estimation of the occurrence of extreme wind speeds is an important factor in achieving the correct balance.
Such estimates are commonly expressed in terms of the quantile value X_T, i.e., the maximum wind speed which is exceeded, on average, once every T years, the return period. Typically, for most users of wind data, estimates of the 50-year return period gust or wind speed are required, based on the available years of observations (often few). For this situation, the data are generally fitted to a theoretical distribution in order to calculate the quantiles.

Fisher & Tippett (1928) showed that if a sample of n cases is chosen from a parent distribution, and the maximum (or minimum) of each sample is selected, then the distribution of the maxima (or minima) approaches one of three limiting forms as the size of the samples increases. Gumbel (1958) argued that, in the case of floods, each year of record constitutes a sample with 365 cases and that the annual extreme flood is the maximum value of the sample. Thus, the Fisher-Tippett distributions could be fitted to the set of annual maxima. This is the basis of all classical extreme value theory. The aim is to define the form of the limiting distribution and estimate the parameters, so that values of X_T can be calculated.

Here, we review and summarize the portion of the extensive literature on extreme value theory relevant to the analysis of wind and gust speed data. The review is carried out with reference to the requirements of a user seeking to select and apply a method appropriate to a particular real data set. We discuss first the available techniques and then the choices required for successful selection and implementation. The methods are organized broadly according to the length of the time series of observations available for analysis, beginning with those which require longer data sets, and moving towards those developed for use with short-duration series.
The length of the observational data set is often a problem for those wishing to calculate extreme wind speed quantiles, but it is a common concern for users of geophysical data, and a substantial literature exists on the estimation of extremes from short time series. Note that we do not consider here the matter of anemometer exposure. As with all analyses of wind data, the results of an extreme value analysis will be flawed if the data on which they are based are taken from an anemometer with a non-standard exposure (e.g., sheltered in one or more directions, or at a height above the ground different from the 10 m standard).

2. The generalized extreme value (GEV) distribution

Classical extreme value theory describes how, for sufficiently long sequences of independent and identically distributed random variables (of which more later), the maxima of samples of size n, for large n, can be fitted to one of three basic families. These three families were combined into a single distribution by Von Mises (1936, in French; see Jenkinson (1955) for an explanation in English), now universally known as the generalized extreme value (GEV) distribution. This has the cumulative distribution function:

F(x) = exp[-(1 - ky)^{1/k}]    k ≠ 0    (1a)
F(x) = exp[-exp(-y)]    k = 0    (1b)

where k is a shape parameter which determines the type of extreme value distribution. Fisher-Tippett Type I has k = 0, and is also known as the Gumbel distribution. Type II has a negative value of k. Both Type I and Type II are unbounded at the upper end, but Type II has a thicker tail than Type I. Type III has a positive value of k and is bounded at the upper end. The standardized or reduced variate, y, is given by:

y = (x - β)/α    (2)

where β is the mode of the extreme value distribution (also known as the location parameter) and α is the dispersion (or scale parameter). For the quantile X_T with return period T, the cumulative probability is given by F(X_T) = 1 - (1/T). Combining equations (1) and (2) and solving for x gives the following expression for X_T:

X_T = β + (α/k){1 - [-ln(1 - 1/T)]^k}    k ≠ 0    (3a)
X_T = β - α ln[-ln(1 - 1/T)]    k = 0    (3b)

(Cook, 1985). Immediately, the calculation of the quantiles is simplified. From equation (3b), only two parameters are required: the location and scale parameters. A graphical solution is often preferred. From equation (2):

x = αy + β    (4)

Thus, on a plot of x on the ordinate against y on the abscissa (see, for example, Figure 1), the intercept will give an estimate of the mode, and the slope gives the dispersion.
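As a concrete illustration (not part of the original paper), equations (3a) and (3b) are straightforward to evaluate once the parameters are known. The function below is a minimal Python sketch; its name and argument order are our own.

```python
import math

def gev_quantile(T, beta, alpha, k=0.0):
    """Return-period quantile X_T from GEV parameters (equations 3a, 3b).

    beta: location (mode), alpha: dispersion (scale), k: shape.
    k = 0 gives the Gumbel (Type I) case.
    """
    # F(X_T) = 1 - 1/T, so -ln F(X_T) = -ln(1 - 1/T) > 0
    w = -math.log(1.0 - 1.0 / T)
    if k == 0.0:
        return beta - alpha * math.log(w)            # equation (3b)
    return beta + (alpha / k) * (1.0 - w ** k)       # equation (3a)
```

For example, with β = 30 m s⁻¹, α = 4 m s⁻¹ and k = 0, `gev_quantile(50, 30.0, 4.0)` gives roughly 45.6 m s⁻¹ for the 50-year speed.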
The first step in the calculation involves selecting from the time series of observations the maximum value for each epoch or period (normally taken to be one year). For each of these annual maxima, it is then necessary to estimate a value of y. From equation (1b) we define the Gumbel reduced variate:

y_Gumbel = -ln{-ln[F(x)]}    (5)

where F(x) is the probability that an annual maximum wind speed is less than x. F(x) can be estimated for each of the observed annual maxima by simply ordering the data from the smallest (x_1) to the largest (x_N), and calculating an empirical value of F(x_m) from the ranked position of x_m. These estimates are known as the plotting positions. Choosing plotting positions which lead to unbiased quantile estimates is not straightforward, and the literature is large, with at least ten published formulae (Guo, 1990). A number of studies of wind extremes use:

F(x_m) = m/(N + 1)    (6)

2.1. A graphical method to find the parameters of the Gumbel distribution

The type of GEV distribution is determined by the form of the parent distribution. The parent distributions of Type I extremes (i.e., with k = 0) include the Weibull distribution. Since it is widely accepted that the Weibull probability density function is a good model for wind speed distributions (Hennessey, 1977), extremes of wind speeds are often modelled by the Type I distribution.

Figure 1. A typical Gumbel plot, based on Sumburgh (Shetland) annual maxima. The Gumbel confidence limits for one and two standard errors (ESDU, 1988; 1990) are shown.

for a sample of N annual maxima (Cook, 1985; ESDU, 1990). However, for the special case of the Gumbel distribution an almost unbiased plotting position is given by:

F(x_m) = (m - 0.44)/(N + 0.12)    (7)

(Gringorten, 1963) and is to be preferred. Equation (7) tends to give lower extreme wind speeds for the same return period than equation (6) (Hannah et al., 1996). To implement the procedure, for each value of x, F(x_m) is calculated and hence y_Gumbel. A graph can be constructed which is essentially the cumulative distribution function with the axes reversed, and linearized by plotting the reduced variate against values of x rather than F(x) against x. Figure 1 is an example of such a Gumbel plot. The horizontal axis shows both the Gumbel reduced variate y_Gumbel and the corresponding return period, T = 1/[1 - F(x)]. The next step is to fit a straight line to the plotted points. This may be done by eye or by using a least-squares fit. Then, the parameters α and β can be found. Substitution in equation (3b) (for k = 0) allows calculation of the required quantile values.

2.2. Refinements to the method

A number of authors (e.g., Cook, 1985; Harris, 1996) have suggested that, for the calculation of wind extremes, a power of the wind speed be used rather than the wind speed itself. That is, the standard variable x is better derived from V^w, where V is the epoch-maximum gust or wind speed and w is the shape parameter of the parent Weibull distribution, equal to 2 for a Rayleigh distribution. Then, V^w has a distribution which at the high end is closer to exponential than V alone. Hence, the distribution of V^w extremes converges faster to the Gumbel (Fisher-Tippett Type I) distribution, which is also exponential in form (Cook, 1982).
Failure to converge may result in false indications of the choice of asymptote: the Gumbel plot will be curved, suggesting a Type III distribution (see Figure 2), when the real reason for the curvature is slow convergence. Rapid convergence is therefore very important, and taking the square of the wind speeds enhances the chances of achieving this. Although taking the square of the extremes makes no difference to their rank order, it does affect the fitting of the straight line in Figure 1 and hence the parameters of the Gumbel model. In this case, X_T in equation (3) will equal the square root of the right-hand side.

Using a least-squares method to find the best-fit line gives equal weight to each of the plotted points. However, this approach is often considered unsuitable for extreme value data, since the error associated with each point varies systematically, being greatest for the largest extremes (Harris, 1996). Various alternative fitting strategies have therefore been devised. One method widely used for wind extremes has been the Lieblein BLUE (best linear unbiased estimators) method (Lieblein, 1974, and described in detail by Cook, 1985). Cook provides look-up tables with values of the parameters required for the calculation of BLUE for sample sizes ranging from n = 10 to n = 24. However, beyond around n = 30, the calculation becomes progressively more intractable, limiting the application of the method (Buishand, 1989). Hüsler & Schüpbach (1986) present a simplified procedure for calculating the BLUE. Harris (1996) suggests the use of a weighted least-squares procedure, which gives results very similar to the BLUE method, but is more readily programmed.

2.3. Alternative methods of parameter estimation

For the case where k ≠ 0, or where k is unknown, the solution is less simple since three parameters must be found: α, β and k. Figure 2 shows the standard Gumbel plot with two added curves for positive and negative values of k.
Note the strong dependence of the quantiles on the value of k and the small range of X_T for positive k.

Figure 2. The three basic families of limiting distribution for the generalized extreme value distribution, here shown for a representative set of annual maximum gust speeds for Sumburgh (Shetland).

There are a number of numerical methods for the determination of the three parameters α, β and k, but the most commonly described in the literature are probability weighted moments (PWMs) and maximum likelihood (ML) solutions. We describe here the PWM method. The ML method is described in detail elsewhere (Davison, 1984; Smith, 1986; Hosking & Wallis, 1987; Davison & Smith, 1990; Wilks, 1995).

Consider a set of PWMs:

B_R = E[x F(x)^R],    R = 0, 1, 2, ...    (8)

Unbiased estimates of the first three PWMs, with the sample X_j ranked in descending order, are given by:

b_0 = (1/n) Σ_{j=1}^{n} X_j    (9a)
b_1 = Σ_{j=1}^{n-1} (n - j) X_j / [n(n - 1)]    (9b)
b_2 = Σ_{j=1}^{n-2} (n - j)(n - j - 1) X_j / [n(n - 1)(n - 2)]    (9c)

where n is the sample size. An approximate estimator of the shape parameter for the GEV distribution is given by:

k̂ = 7.8590c + 2.9554c²,    c = (2b_1 - b_0)/(3b_2 - b_0) - ln 2/ln 3    (10)

(Hosking et al., 1985). Then, the shape and location parameters can be estimated from:

α̂ = (2b_1 - b_0)k̂ / [Γ(1 + k̂)(1 - 2^{-k̂})]    (11a)
β̂ = b_0 + α̂[Γ(1 + k̂) - 1]/k̂    (11b)

where Γ is the gamma function (see, e.g., Press et al., 1992, for methods of calculation). Alternative methods for deriving the estimators of B_R exist (see, for example, Hosking et al., 1985, who use plotting-position estimators). A related technique for parameter estimation is known as the L-moments method (Hosking, 1990). As explained by Stedinger et al. (1993), the first R L-moments are simply linear combinations of the first R PWMs and will give the same parameter estimates.

It is, of course, possible to evaluate the parameters of the Gumbel distribution using PWMs rather than the graphical approach described in section 2.1 (see, for example, Abild et al., 1992a). Expressions for the PWM estimators of the Gumbel distribution are (Stedinger et al., 1993):

α̂ = (2b_1 - b_0)/ln 2    (12a)
β̂ = b_0 - 0.5772α̂    (12b)

Hosking et al. (1985) provide a simple test to determine whether a particular set of extremes follow a Gumbel or GEV distribution, based on the PWM estimator of k.

2.4. Drawbacks and alternative approaches

The principal drawback to the classical GEV/Gumbel method is that only one value is selected per epoch. This reduces the data available for analysis, such that the underlying data set from which the epochal extremes are drawn must be long: Cook (1985) suggests at least 20 years of data for reliable results (i.e. 20 extremes for analysis), and states that the method should not be employed with fewer than 10 years. To increase the number of cases for analysis, alternative approaches have been developed as follows:

(a) The concept of a single extreme per epoch has been extended to include the r-largest values.
(b) The method of independent storms (MIS) uses a lull, or period of wind speeds below a selected (low) threshold, to separate storms. Then, the highest extreme is selected from each storm. After further pre-processing (see section 4), the sample of extremes is fitted to the GEV.
(c) Peak-over-threshold (POT) maxima, extracted from sample data series to produce a series of extreme values above a chosen (high) threshold, have been used with the generalized Pareto distribution (GPD).

The great advantage of using the GEV distribution with annual maxima is that few decisions are required during the calculation of the distribution parameters and quantiles. With the above methods, decisions must be made which can have a strong impact on the estimated values of the distribution parameters and quantiles. First, it is necessary to decide on the size of r, for the r-largest method, or the threshold value, for the MIS and POT methods. Second, for extreme value theory to be applied successfully, the extremes must be independent and identically distributed. For the GEV distribution based on annual maxima, this is likely to be the case unless the maxima for two years are drawn from a single storm persisting from late December to early January, and this unlikely eventuality should be guarded against. However, for r-largest and POT samples, the probability of extracting dependent extremes becomes very high, because of the strong serial correlation in wind data, if no steps are taken to maintain independence. A minimum separation distance between extremes is usually employed to ensure independence, and the second decision required is the length of that separation distance. The MIS method avoids this problem by creating a set of independent storms from which single extremes are selected.

3. Extending the classical method to the r-largest values

The underlying theory for this method is set out by Weissman (1978). Beginning with a sample of the r largest values X_1 > X_2 > ... > X_r in a single epoch, or year, the joint probability density function for this sample is given by:

f(x_1, ..., x_r) = α^{-r} exp[-exp(-Z_r) - Σ_{j=1}^{r} Z_j]    k = 0    (13a)
f(x_1, ..., x_r) = α^{-r} exp[-(1 - kZ_r)^{1/k} + (1/k - 1) Σ_{j=1}^{r} ln(1 - kZ_j)]    k ≠ 0    (13b)

where Z_j = (x_j - β)/α. If we assume that data for separate years are independent, then the product of these joint densities is approximately the joint density distribution for the full set of observations, X_{n1}, ..., X_{nr}, for each of N years (1 ≤ n ≤ N). Smith (1986) gives the likelihood functions for this product, from which estimates of α and β can be computed numerically using ML methods. Quantile estimates are then given by equation (3). In applying this method, it is necessary to decide, first, how to maintain the independence of extremes and, second, the size of r (Tawn, 1988). The extremes are more likely to be independent if r is kept small (Smith, 1986). Coles & Walshaw (1994), in an analysis of a six-year record of hourly maximum gust speed data for Sheffield, set r = 10 with a minimum separation distance of 60 hours. Wang (1995) examines two criteria to test for optimal r for the Gumbel distribution: generalized least-squares and the Shapiro-Wilk goodness-of-fit test. He prefers the latter. Tawn (1988) emphasises that parameter and quantile estimates based on the r-largest annual events should be more accurate than those derived from the classical method. He argues that, since α and β are independent of r, they are precisely the same parameters as would be estimated by the classical method. Thus, provided the r-largest independent maxima can be extracted, more relevant data are being used.
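One simple way to extract the r largest values while enforcing a minimum separation, in the spirit of the choices discussed above, is a greedy scan of the ranked values. The function below and its greedy strategy are our own simplification for illustration, not a published algorithm.

```python
def r_largest_independent(values, r, min_sep):
    """Select the r largest values in one epoch, at least `min_sep` time
    steps apart, as a crude way of enforcing independence.

    `values` is a sequence of (say, hourly) wind speeds for one year.
    Illustrative only: scans indices in order of decreasing speed and
    keeps a candidate only if it is far enough from all kept peaks.
    """
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    chosen = []
    for i in order:
        if all(abs(i - j) >= min_sep for j in chosen):
            chosen.append(i)
            if len(chosen) == r:
                break
    return sorted((values[i] for i in chosen), reverse=True)
```

With hourly data and, for example, a 60-hour separation as used by Coles & Walshaw (1994), `min_sep` would be 60.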
4. The method of independent storms (MIS)

The Cook (1982; 1985) method of independent storms (MIS) increases the number of extremes available for analysis, whilst ensuring their independence, by separating the parent time series of wind speeds into independent storms and then selecting the highest value from each storm. This is achieved by first applying a long-duration low-pass filter to the data (by calculating non-overlapping ten-hour means). The filtered data set is then searched to find downward crossings of the 5 m s⁻¹ threshold, each defined as the start of a lull. Between each pair of crossings there will be a storm, which will be independent of the preceding and succeeding events because of the intervening lull. Cook (1982) finds a typical storm frequency to be around 100 events/year. The original unfiltered data set is then searched for the maximum wind speed in each storm, thus creating the basic data set of extremes for analysis. Cook (1982) uses the method described in section 2.1 to fit the selected extremes to a GEV Type I (Gumbel) distribution. He transforms wind speed to dynamic pressure (0.5ρV², where ρ is air density) in order to achieve a rapid rate of convergence. However, rapid convergence is already assisted by the two-stage procedure for extracting extremes. The data set of storms (from which the extremes are drawn) is a subset of the original parent, and has a parent distribution of its own which is much closer to, and therefore converges much faster towards, the Type I asymptote. The best-fit line is found using the BLUE technique (section 2.2). Cook shows how the BLUE coefficients for ten maxima can be used to properly weight the much larger sample obtained using the MIS. He also investigates the effect of varying the lull threshold to reduce the number of selected storms.
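The storm-separation step of the MIS can be sketched as below. This is a simplified reading of the procedure (non-overlapping ten-hour means, a 5 m s⁻¹ lull threshold, maximum raw speed per storm); details of Cook's original implementation differ, and the function is illustrative only.

```python
def mis_extremes(speeds, lull_threshold=5.0, block=10):
    """Sketch of MIS storm separation for an hourly wind-speed record.

    Forms non-overlapping `block`-hour means (a crude low-pass filter),
    treats a drop of the mean below `lull_threshold` as the start of a
    lull, and returns the maximum raw speed within each storm.
    Illustrative only; not Cook's original implementation.
    """
    means = [sum(speeds[i:i + block]) / len(speeds[i:i + block])
             for i in range(0, len(speeds), block)]
    maxima, in_storm, peak = [], False, 0.0
    for m, i in zip(means, range(0, len(speeds), block)):
        if m >= lull_threshold:
            chunk_max = max(speeds[i:i + block])
            peak = max(peak, chunk_max) if in_storm else chunk_max
            in_storm = True
        elif in_storm:             # downward crossing: the storm has ended
            maxima.append(peak)
            in_storm = False
    if in_storm:                   # record still in a storm at the end
        maxima.append(peak)
    return maxima
```

The returned list of storm maxima would then be fitted as described in section 2.1 (after transformation to dynamic pressure, if desired).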
He finds that a threshold yielding around 10 events/year gives a reliable estimate of the 50-year extreme. Harris (1998) introduces two improvements to the MIS. First, to avoid systematic errors, he modifies the plotting position used by Cook (1982). Second, he substitutes his own method to find the best-fit line (Harris, 1996) for the Lieblein BLUE method. This avoids the need for data reduction, which Cook found necessary in order to use the Lieblein look-up tables.

5. Peak-over-threshold methods with the GPD

Theory developed in recent years allows analysis of all values exceeding a specified threshold. The methods are known as peak-over-threshold (POT) or, particularly in hydrology, partial duration series. An asymptotic distribution, the generalized Pareto distribution (GPD), is used to describe the behaviour of the events above the specified threshold. Like the GEV distribution, the GPD has a shape parameter (k) and a scale parameter (α). The maxima of samples of events from the GPD are GEV distributed and have a shape parameter equal to the shape parameter of the parent GPD. For sufficiently high thresholds, the number of observations above threshold per year (the crossing rate) is low and Poisson distributed. The cumulative distribution function for the GPD is:

F(x) = 1 - [1 - (k/α)(x - ξ)]^{1/k}    (14)

where ξ is the selected threshold, i.e., the values of x - ξ are the exceedances. For k = 0, the GPD is just an exponential distribution:

F(x) = 1 - exp[-(x - ξ)/α]    (15)

In order to calculate the quantiles, it is necessary to estimate the crossing rate of the threshold. If the exceedance process is assumed to be Poisson with rate λ per year, an unbiased estimate of λ is n/M, where n is the total number of exceedances over the selected threshold ξ, and M is the number of years of record. Then (Abild et al., 1992a):

X_T = ξ + (α/k)[1 - (λT)^{-k}]    k ≠ 0    (16a)
X_T = ξ + α ln(λT)    k = 0    (16b)

Although straightforward graphical procedures do not exist, numerical estimation of the distribution parameters is in fact simpler than for the GEV, since ξ is known and only α and k (where k ≠ 0) must be found. Using the PWM method, only the first two PWM estimators in equation (9) are required. Then, the estimates of k and α are given by:

k̂ = b_0/(2b_1 - b_0) - 2    (17a)
α̂ = (1 + k̂)b_0    (17b)

and are valid within the range -0.5 < k < 0.5 (Abild et al., 1992a). We know of no data set of wind extremes where these limits are exceeded. For the case k = 0, of course, α̂ = b_0. For ML solutions, both Barrett (1992) and Davison & Smith (1990) give the log-likelihood to be maximized and provide guidance on solution procedures.

Extreme value theory rests on the assumption of independence in the underlying observations. For POT methods, independence requires a suitable combination of both threshold and minimum separation time between events. With a high threshold, the separation can be reduced without compromising independence, but with a low threshold the separation must be increased if independence is to be maintained (Walshaw, 1994). The extremal index is a quantitative measure of the degree of clustering (non-Poisson behaviour) in a set of extremes and can serve as a valuable tool in determining when events are independent. Calculation methods are discussed by Smith & Weissman (1994).

5.1. Threshold selection

On the one hand, the threshold must be set high enough that only true peaks, with Poisson arrival rates, are selected. If this is not the case, the distribution of selected extremes will fail to converge to the GPD asymptote. The result may be a false indication of the correct asymptote (e.g., a Type III distribution may be indicated instead of the correct Type I). On the other hand, the threshold must be set low enough to ensure that enough data are selected for satisfactory determination of the distribution parameters (Abild et al., 1992a). The choice of threshold must be appropriate for the climatology of the site. For the UK, Cook (1985) recommends a threshold of 15 m s⁻¹ for hourly-mean wind speeds and 30 m s⁻¹ for gusts. However, for the spatially variable wind climatology of the UK, where annual mean wind speeds at 10 m above the ground range from 3-4 m s⁻¹ in inland areas of south-east England to over 10 m s⁻¹ in exposed northern uplands, it is unlikely that a single threshold will be appropriate for all sites. Brabson & Palutikof (1998) find for Sumburgh, Shetland, that a threshold in the range m s⁻¹ is most suitable, depending on the separation time. Various techniques exist to aid threshold selection. For example, conditional mean exceedance (CME) graphs, also known as mean residual life graphs, plot the mean excess over threshold as a function of threshold (Davison, 1984; Ledermann et al., 1990). For a GPD-distributed spectrum, the CME graph should plot as a straight line. An appropriate threshold can be chosen by selecting the lowest value above which the graph is a straight line. Walshaw (1994) defines a modification of the CME where clustering of the mean exceedances is removed for each choice of threshold.
This so-called reclustered excess graph also exhibits linear behaviour for a GPD spectrum.

5.2. Separation time

With POT (and r-largest) methods, it is customary to assign a minimum separation time to ensure the independence of extremes. A number of authors use a separation of 48 hours for European wind climates (e.g. Cook, 1985; Gusella, 1991). Walshaw (1994) uses 60 hours for Sheffield wind data.

6. Standard errors and confidence limits

For both the GEV distribution and GPD, uncertainty in the estimation of distribution parameters and quantiles arises from causes such as inefficiencies in determining model parameters and arbitrary choice of statistical model. The precision of the estimates of the distribution parameters and quantiles is expressed by the standard error, which is the square root of the variance of the error in the estimates. Small standard errors are an important criterion in assessing the success (or otherwise) of an extreme value analysis. Calculation of the standard errors of the distribution parameters is fairly straightforward (Abild et al., 1992a). For the standard error of the quantile estimates, the equations for the case k = 0 are, for the GEV distribution:

SE[X̂_T] = (α/√n)(1.11 + 0.51Z + 0.61Z²)^{1/2}    (18)

where n is the sample size and:

Z = -ln[-ln(1 - 1/T)]

For the GPD the corresponding standard error is:

SE[X̂_T] = (α/√n)[1 + ln²(λT)]^{1/2}    (19)

For k ≠ 0, Abild et al. (1992a) provide a helpful discussion and graphs of the normalized standard error as a function of k and the return period. Hosking & Wallis (1987) describe clearly the relationship between the standard error and confidence limits.

An alternative approach to estimating precision is to use Monte Carlo simulation. Suppose a data set of extremes has been used to calculate the distribution parameters of either the GEV distribution or GPD. These distribution parameters are denoted here by the vector a_0. Then, these calculated parameters of the original distribution are used to generate, say, 100 synthetic time series, each of which can in turn be used to calculate a new set of distribution parameters (a_1, a_2, ..., a_100). Assuming that a_i - a_0 is a reasonable surrogate for a_i - a_true, the standard error (i.e., the standard deviation of the values a_1 - a_0, ..., a_100 - a_0) can be calculated. This method can also be used to fit confidence limits to the parameter estimates without assuming that the errors are normally distributed (see Press et al., 1992, section 15.6). The use of Monte Carlo techniques to estimate uncertainties in the distribution parameters and quantile estimates of extremes is widespread (e.g., Zwiers & Ross, 1991; Guttman et al., 1993; Wilks, 1993; Lye & Lin, 1994; Fill & Stedinger, 1995).
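A minimal resampling scheme of the sort described above might look as follows. For brevity this sketch estimates the Gumbel parameters by simple moments (α = s√6/π, β = mean - 0.5772α) rather than by the fitting methods discussed earlier; it is illustrative only, not the method of any of the cited studies.

```python
import math
import random

def bootstrap_se_x50(maxima, nboot=200, seed=0):
    """Bootstrap standard error of the Gumbel 50-year quantile.

    Each synthetic set is drawn by resampling `maxima` with replacement;
    the quantile is re-estimated each time and the standard deviation of
    the estimates is returned. Illustrative sketch: Gumbel parameters
    come from moment estimates for brevity.
    """
    rng = random.Random(seed)

    def x50(sample):
        n = len(sample)
        mean = sum(sample) / n
        var = sum((v - mean) ** 2 for v in sample) / (n - 1)
        alpha = math.sqrt(6 * var) / math.pi       # Gumbel scale by moments
        beta = mean - 0.5772 * alpha               # Gumbel location by moments
        return beta - alpha * math.log(-math.log(1 - 1 / 50))  # eq (3b)

    est = [x50([rng.choice(maxima) for _ in maxima]) for _ in range(nboot)]
    m = sum(est) / nboot
    return math.sqrt(sum((e - m) ** 2 for e in est) / (nboot - 1))
```

The same resampling loop can wrap any of the fitting procedures described in sections 2-5, and the spread of the resampled estimates can also be read off directly as a non-parametric confidence interval.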
For wind speeds, the method has been used by Zwiers (1987) and Lechner et al. (1992; 1993). A rapid but reliable approach to Monte Carlo simulation is to use bootstrapping. The synthetic data sets are generated by random sampling, with replacement, from the original data set. Replacement (i.e., after each data point is randomly sampled from the time series, it is replaced before the next point is drawn) means that the synthetic data sets can vary from the original because of the possibility of duplication, and therefore the parameter and quantile estimates can differ.

7. Choices at the application stage

7.1. Distribution type

Traditionally, extreme gust and wind speed data for temperate latitudes are fitted to a Type I distribution (e.g., Cook, 1985; Ross, 1987; Gusella, 1991; Abild et al., 1992a). The justification for this is twofold. First, the parent distributions of Type I extremes include the Weibull distribution, commonly followed by both wind and gust speeds (see, e.g., Hennessey, 1977). Second, the Type I distribution is unbounded at the upper end, whereas the Type III form is bounded from above, by β + (α/k) for the GEV distribution and by ξ + (α/k) for the GPD. It is argued that there is no natural upper bound to wind speed anywhere approaching the orders of magnitude at which wind speeds are naturally observed. The assumption of a Type I distribution has the added attraction that, since k is set equal to zero, the solution is simpler.

A number of authors have disputed this assumption. The choice is between the Type I and Type III forms: the Type II distribution is seldom found and is generally indicative of a mixed wind series which should be decomposed (Gomes & Vickery, 1978). Walshaw (1994) makes a strong case for the use of Type III distributions for the upper tails of wind speed series. Analysing a ten-year series of hourly gust data, he rejects the possibility that k is zero on the basis of likelihood ratio tests (Hosking, 1984).
Following Lechner et al. (1992), he points out that convergence to the Type I distribution can be extremely slow and that the Type III distribution, as a penultimate asymptotic approximation, can in this circumstance give a better fit even for sample sizes as large as one billion. He concludes that, if the data support the case for Type III upper tails, then a positive shape parameter should be fitted. Lechner et al. (1992) found, using CME plots, that of 100 wind records in the US, 61 showed a Type III limiting form, 36 were of Type I, and 3 were Type II. Simiu & Heckert (1996), in a similar study of 44 sites in the US, also found that the Type III distribution prevailed. Abild et al. (1992a) compare the results obtained by assuming Type I and Type III distributions of the upper tail. They warn that, because tail behaviour is highly sensitive to variations in k (and see Figure 2), the use of the Type III distribution will not provide reliable estimates when fitted to a short record. This, combined with their concern for the lack of physical justification for the assumption of an upper-bounded Type III distribution, leads them to conclude that a Type I distribution is the preferred choice.

In the light of these conflicting views, it is difficult to offer advice on the choice of distribution type. It would appear sensible to test for non-zero k using, for example, the methods reviewed by Hosking (1984). It is important to recall, as shown by Figure 2, that the Type III distribution will normally give lower extreme gust or wind speeds for a given return period than the Type I form (Simiu & Scanlan, 1986; Walshaw, 1994). To quote Walshaw (1994), it then becomes essential to make adequate allowance for the margins of error associated with return level estimation.

A wide range of goodness-of-fit tests exist to aid model selection. Ross (1987) and Zwiers (1987) use the Anderson-Darling statistic, which they prefer over the Kolmogorov-Smirnov test (used, e.g., by Quek & Cheong, 1992) because it gives greater weight to the tails. Graph-based tests also exist (see Vogel et al., 1993, for a review). As an example, goodness-of-fit tests based on probability plots were employed by Karim & Chowdhury (1995) in an extremes analysis of hydrological data. They also consider the use of L-moment ratio diagrams. An L-moment ratio diagram is a plot of one moment ratio as a function of another of a lower order. Karim & Chowdhury plot the L-coefficient of skewness as a function of the L-coefficient of variation for observations and for random data generated by candidate distributions. They are able to show, for their particular data set, that the GEV distribution gives the best fit.

7.2. Selection of method for parameter estimation

We have described a number of methods for parameter estimation. The characteristics of the data need to be taken into account in the selection of the most appropriate method. Although ML methods introduce negligible bias for sample sizes in the range N = 30 to N = 100, in studies of extreme wind speeds the samples are often smaller, and in this case Abild et al.
(1992a) recommend PWM (Hosking & Wallis, 1987; Wang, 1991). In cases where the shape parameter, k, has a non-zero value, PWM will be severely biased for large positive values (k > +0.2). No examples of such large values were found in the literature on wind extremes. Brabson & Palutikof (1998) compare the ML and PWM methods in a GPD analysis of wind speeds and find that the ML method provides much more stable parameter estimates over a range of thresholds.

Alternative approaches to parameter estimation exist. Lechner et al. (1993) use simulated wind speed data to compare three methods for estimating the GPD parameters: conditional mean exceedance (CME), Pickands (Pickands, 1975) and Dekkers-Einmahl-de Haan (DED), which provides a moment-based estimator (Dekkers et al., 1989). They obtain good results for the CME and DED methods and, for wind engineering applications, recommend the DED approach. Simiu & Heckert (1996) use the de Haan (1994) method in their analyses of US wind speeds.

Data censoring

Although techniques such as BLUE seek to reduce the effect of the information in the tails, as a general rule the most extreme observations have a pronounced influence on parameter estimates (Davison & Smith, 1990; Barrett, 1992). This has led to the concept of censorship. POT methods effectively censor the selection of extremes from a data series, by imposing a lower limit below which the extremes cannot fall. No such automatic censorship is imposed in the classical, epoch-based, approach. If the maximum value in an epoch is not a true extreme, then its inclusion may bias the estimation of the distribution parameters. A number of authors have practised left-censorship in order to avoid this situation (e.g., Moriarty & Templeton, 1983; Wang, 1991). In a left-censored sample, only the number of extremes below some threshold is taken into account, and not their actual values. This reduces their influence (Buishand, 1989).
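To make the estimators discussed above concrete, both the L-moment ratios used in diagrams of the Karim & Chowdhury type and the closed-form PWM estimates of the GPD parameters can be computed from the same probability-weighted moments. The following is an illustrative sketch, not the authors' code; the GPD shape convention follows Hosking & Wallis, with k > 0 implying a bounded upper tail.

```python
import numpy as np

def l_moment_ratios(sample):
    """L-CV (tau) and L-skewness (tau_3) from the first three L-moments,
    computed via the unbiased probability-weighted moments b0, b1, b2."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) * x) / (n * (n - 1))
    b2 = np.sum((j - 1) * (j - 2) * x) / (n * (n - 1) * (n - 2))
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l2 / l1, l3 / l2

def gpd_pwm(excesses):
    """PWM estimates of the GPD scale a and shape k from threshold excesses
    (closed-form estimator in the style of Hosking & Wallis, 1987)."""
    x = np.sort(np.asarray(excesses, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    a0 = x.mean()                              # estimate of E[X]
    a1 = np.sum((n - j) * x) / (n * (n - 1))   # estimate of E[X(1 - F(X))]
    k = a0 / (a0 - 2 * a1) - 2
    a = 2 * a0 * a1 / (a0 - 2 * a1)
    return a, k
```

An L-moment ratio diagram then plots tau_3 against tau for the observations and for random samples drawn from each candidate distribution. For the ML alternative, a routine such as scipy's genpareto.fit can be used, noting that scipy's shape parameter has the opposite sign to k as defined here.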
Right-censorship may be appropriate where there is doubt concerning the accuracy of measurement of the most extreme observations. Barrett (1992) employs this technique for very high flood levels which exceed the capacity of the recording mechanism. In the case of wind extremes, an equivalent situation would exist where the anemometer instrument or the supporting structure failed but it was considered likely, on the basis of evidence from neighbouring recording sites, that a certain upper threshold was exceeded. Right-censorship would then be an appropriate technique to permit the event to be included in the analysis. The ML method for parameter estimation can easily be modified to handle censored samples (Harter & Moore, 1968; Leese, 1973). Also, the weighted least-squares technique proposed by Harris (1996) for line fitting in a standard Gumbel distribution analysis (see section 2.2) lends itself to right-censorship. The largest value can simply be omitted from the fitting procedure and the weights of the remaining points renormalized to sum to unity.

Partitioning the data

Extreme wind speed predictions are normally made from an analysis of the full set of extremes at a site. For certain studies, however, considering the data set by wind direction (directionality) or by time of year (seasonality) may be useful. First, knowledge of maximum wind speed by direction or by season may in itself be useful. Second, a more accurate estimate of

extreme wind speed behaviour may be possible when using a simpler data set with fewer participating meteorological mechanisms. If different meteorological mechanisms are responsible for extremes from different directions or in different seasons (a mixed climate), then partitioning may allow the analysis to focus on a single underlying mechanism at a time. However, where the underlying mechanisms are not conveniently segregated by season or by direction, then it may be necessary to partition on the basis of the mechanisms themselves.

There are also disadvantages to using a partitioned data set. Most importantly, the amount of data in any partition is reduced, and therefore the standard errors on quantiles increase. Second, partitioning may introduce unnecessary complexity. Use of the total data set when selecting extremes above a high threshold automatically concentrates the analysis on the upper end of the parent distribution, which is the desired result. In general, partitioning is only necessary or desirable if there is good cause.

(a) By wind direction

Moriarty & Templeton (1983) suggest that considerable cost savings could be achieved if design engineers were to take into account the directional characteristics of extreme wind speeds. Three wind direction-dependent analyses are described in the literature. Moriarty & Templeton (1983) simply classified the set of maximum daily gust speeds by six directional sectors and performed individual Gumbel analyses on each. In this analysis they identified masking as a potential problem, where the maximum gust speed in one direction is masked by a higher gust speed from a different direction on the same day. They also highlight the difference in the length of the underlying data set of gusts by sector, from which the annual maxima are extracted, as invalidating the underlying assumptions of extreme value analysis.
This latter objection is overcome by Coles & Walshaw (1994), who work with resolved wind components, such that each gust has a component of speed in each direction, and a complete sequence of data is available for each direction. However, a further problem is created, in that correlation across directions exists as a result of changes in the wind direction during a storm. This objection must be overcome if, for example, the sectorized data are to be used to derive the joint probabilities of extreme winds in several directions. The theory of max-stable processes (de Haan, 1984) was originally developed to describe the dependence of extremes in geographic space, and is adapted by Coles & Walshaw to overcome the problem of correlation across directions within a storm. Cook (1982) used the method of independent storms (MIS) to analyse wind speed extremes by direction, and later addressed the problem of correlation between adjacent directions (Cook, 1983). Although some direction sectors achieve a reduction in design loads using the Cook method when compared to the all-direction values found from a standard annual maxima Gumbel analysis, no significant reductions are achieved for the more common wind directions.

(b) By season

The terms seasonality and non-stationarity are sometimes used interchangeably in the literature (e.g., Walshaw, 1994). However, here we make a clear distinction between the two. We use seasonality to describe a seasonal variation in the mixture of underlying meteorological mechanisms and their corresponding wind speed distributions. Non-stationarity, on the other hand, we use to refer to interannual fluctuations due either to natural causes (for example, a trend towards higher wind speeds caused by climate change) or to human intervention or error (a change in instrument type, faulty instrumentation or the erection of a building close by). In either case, the requirement of extreme value analysis for an identically distributed variable will not be met.
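Where seasonality of this kind is present, one remedy (discussed further below) is to standardize out the annual cycle before the extremes analysis. The following is a minimal illustration only, assuming a daily wind speed series indexed by day of year; the circular moving window and its width are our own choices, not a prescription from the literature.

```python
import numpy as np

def deseasonalize(speeds, day_of_year, window=15):
    """Remove the annual cycle by standardizing each observation with a
    day-of-year mean and standard deviation estimated in a circular
    moving window of +/- `window` days."""
    speeds = np.asarray(speeds, dtype=float)
    doy = np.asarray(day_of_year)
    out = np.empty_like(speeds)
    for d in range(1, 367):
        # circular distance in days between each observation and day d
        dist = np.minimum(np.abs(doy - d), 366 - np.abs(doy - d))
        sel = dist <= window
        if not sel.any():
            continue
        m, s = speeds[sel].mean(), speeds[sel].std()
        here = doy == d
        out[here] = (speeds[here] - m) / (s if s > 0 else 1.0)
    return out
```

The standardized series should then have little remaining annual cycle; extremes selected from it reflect departures from the seasonal norm rather than the norm itself, which is precisely the caveat noted below about tails versus the central part of the data.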
Non-stationarity is discussed in section 7.5. Davison & Smith (1990) identify and discuss two basic approaches to the handling of seasonality. The first prewhitens the series by removing known seasonal components prior to extreme value analysis. As examples, Ross (1987) and Zwiers (1987) seasonally adjusted sets of maximum daily wind speed by subtracting an estimate of the annual cycle of the mean, then dividing by an estimate of the annual cycle of the standard deviation. However, Davison & Smith note one potential disadvantage with this method: the seasonal effects which are identified as affecting the central part of the data may not be valid with respect to the tails. The second approach is a separate-seasons strategy, in which a separate GEV or GPD model is fitted for each season. For a GPD analysis, this involves choosing threshold values and minimum separation times for each season. The MIS method of Cook (1982) can also be used successfully in a separate-seasons analysis (Cook, 1983).

(c) By mechanism

For wind speeds in particular, the different causative mechanisms may not be clearly identified with separate directions or seasons. Twisdale & Vickery (1992) highlight the importance and significance of small but intense thunderstorms as generators of extreme wind speeds in many parts of the non-hurricane regions of the US. They distinguish between the parent distributions of extratropical pressure systems and intense thunderstorms, having analysed the records from four weather stations. They warn against the use of mixed observation series for extreme value analysis

and advise that data produced by different phenomena should be treated separately to improve predictions. Gomes & Vickery (1978) carried out a similar analysis for Australia, but in this case four extreme wind-generating mechanisms were disaggregated and studied. The combined risk for these (mutually exclusive) events is obtained from the sum of the individual risks of exceedance; i.e., for GEV Type I, and from equation (1), the combined probability of non-exceedance is

P = 1 - {[1 - exp(-exp(-y_A))] + [1 - exp(-exp(-y_B))] + ...}    (20)

where y_A, y_B, etc. are the reduced variates for the data sets related to the different wind mechanisms.

The properties of the GPD can be used to search for a mixture of underlying mechanisms. The shape parameter, k, should be constant with threshold for a homogeneous data set generated by a single mechanism. Thus, if k is found to vary as a function of threshold, this may be indicative of a mixed climate (Brabson & Palutikof, 1998). The CME plot provides a graphical method of observing a change in the contributing GPD distributions with threshold. Changes in the CME slope, -k/(1 + k), with threshold value can be indicative of a mixed climate. Figure 3 shows an example CME plot for hourly maximum gust speed data. An obvious slope change at 36 m s^-1 corresponds to a change in k.

7.5 Non-stationarity

Non-stationarity may be due to natural causes or human error/activity. Some geophysical time series will contain genuine long-term trends which may be expected to persist into the future. A good example is sea level, which at many locations exhibits an increase with time in response to some combination of subsiding land (due to isostatic adjustment or human activity) and rising absolute sea level due to climate change. For this case, Tawn (1988) presents an extreme value model which allows for a linear trend.
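Returning briefly to the mixed-climate tools above, both the combined Type I risk of equation (20) and the CME diagnostic are simple to compute. The sketch below is illustrative only; the threshold grid is a placeholder, and equation (20) is a small-risk approximation that can go negative if individual risks are large.

```python
import math
import numpy as np

def combined_non_exceedance(reduced_variates):
    """Equation (20): non-exceedance probability for mutually exclusive
    mechanisms, as 1 minus the sum of the individual Type I exceedance
    risks 1 - exp(-exp(-y))."""
    return 1.0 - sum(1.0 - math.exp(-math.exp(-y)) for y in reduced_variates)

def cme(speeds, thresholds):
    """Conditional mean exceedance: mean of (x - u) over all x exceeding u.
    For a single GPD regime the result is linear in the threshold u; a kink
    in the plot, as in Figure 3, suggests a mixed climate."""
    x = np.asarray(speeds, dtype=float)
    return np.array([(x[x > u] - u).mean() for u in thresholds])
```

In practice one plots cme(speeds, thresholds) against the thresholds and looks for a change of slope, exactly as described for Figure 3.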
Although there is no evidence of long-term linear trends in wind speeds for the British Isles, a number of authors have suggested that the north-eastern Atlantic region experiences low-frequency variability on decadal time scales (Palutikof et al., 1987; WASA Group, 1998), possibly linked to the North Atlantic Oscillation (Hurrell, 1995). As a result, an extreme value analysis based on data from a period of low wind speeds may underestimate the extremes; this argues for analyses to be based on long time series for meteorological as well as statistical reasons.

Concerning the effects of human activity or error, Zwiers (1987) commented on the dirtiness of long geophysical time series. For wind records in particular, which until relatively recently were compiled from chart recorders or dials, there was a strong subjective element in the measurement of mean wind speeds. The determination of gust speeds, on the other hand, is objective. One source of non-stationarity in mean wind speed series is therefore a change in observer. Other causes, for both mean wind speeds and gusts, include a change in instrument type (Smith, 1981), faulty instrumentation, and changes in the landscape around the recording site (Palutikof et al., 1984). Zwiers (1987) found a downward trend in his wind data which he attributed to urbanization effects.

Figure 3. Conditional mean exceedance (CME) plot for hourly maximum gust speeds at Sumburgh (Shetland) for the period , using a 48-hour minimum separation time.

These potential sources of error are not unique to extreme value analysis or to wind records. Techniques to assess and, where possible, correct the resulting inhomogeneities are available. A simple time series plot of the data may be sufficient to reveal the presence of trends or step jumps, and is a necessary precursor to further analysis.
It may, for example, prove necessary to reject part of the time series in order to create a homogeneous data set for analysis (e.g., Tawn, 1988).

8. Strategies with short data sets

For accurate parameter estimation of the GEV distribution and GPD, sample sequences must be sufficiently long. Cook (1985) advises that a minimum of ten years of observation is needed for classical analysis of extreme hourly wind speeds. For r-largest and POT methods, five to six years may be sufficient (e.g., Coles & Walshaw, 1994). For both statistical and meteorological reasons, it is certainly the case that longer records will produce more accurate estimates of wind extremes. Statistically, since the standard error is inversely proportional to the square root of the sample size, larger samples imply smaller standard errors. Meteorologically, it is clearly desirable to encompass the full range of variability in extremes and, where low-frequency variability exists, this may require a long time series. Yet the analyst may have only a few years

of data available. To resolve this conflict, which is common in many areas of geophysics, a number of strategies have been developed. They are described briefly below. Of these methods, only those based on a comparison with nearby long-term wind speed records attempt to address the problem of low-frequency variability (e.g., Peterka, 1992; Bocciolone et al., 1993; Abild et al., 1992b).

Techniques to lengthen the available data set

The first set of solutions is based on classical methods involving the GEV distribution or GPD, but steps are taken to artificially lengthen the data set available for analysis. Peterka (1992), working with wind data for the midwestern US, formed a superstation with 924 years of data from the detrended records of 29 individual stations. An extremes analysis based on the Gumbel distribution was then performed. Such an approach is only feasible in a climatologically homogeneous area. The detrending was carried out using the mean annual extreme, so that the resulting superstation data may be expected to be homogeneous in the upper tail. The predicted 50-year recurrence wind speed for the superstation was then distributed amongst the original 29 stations by adding back the trend. A comparison with 50-year extreme wind speeds calculated on a station-by-station basis showed considerable differences, which the author ascribes to sampling error from short records. Note, however, that the 924 extremes in the Peterka analysis are not properly independent of one another, since each storm event will have contributed to extreme wind speeds at a number of stations.

A number of authors have used regression techniques to extend records for extreme value analysis. Bocciolone et al. (1993) required design wind speeds for a site with only six years of observations. They first calculated return-period extremes at neighbouring sites with longer records.
These were then adjusted using the ratio of extreme wind speeds (greater than 15 m s^-1) at the site of interest to the extremes at the long-record sites over the six-year period. The predictor station was selected by inspection of correlation coefficients, and the analysis was performed by direction sectors, with different predictor stations used in different sectors. Unfortunately, the authors do not compare their results with those from, for example, a straightforward POT analysis of the six years of data at the initial site, making it difficult to evaluate the results. Similar techniques have been used by wind farm developers in the UK (Hannah et al., 1996).

A method is described in the European Wind Atlas (Troen & Petersen, 1989) to extrapolate wind data from one site to another, principally for the purposes of wind energy estimation. The extrapolation takes into account terrain factors and local siting factors such as roughness and shelter. Abild et al. (1992b) applied this procedure to extrapolate extreme wind speed data between two Danish sites. Cook (1996) develops a similar technique to correct the 50-year extreme wind speed calculated from site observations for topography, roughness length and height above ground. He terms the result the basic wind speed.

Grigoriu (1984) and Simiu & Scanlan (1986) enlarged their available data set of extremes for classical analysis based on the Gumbel distribution by selecting the largest value in each month, i.e., using the month as the epoch rather than the year. (Note that this is not the same procedure used to control for seasonality, since the monthly extremes are combined in a single analysis.)
This approach is unlikely to yield satisfactory results in a region with a strong seasonal cycle of wind speeds, making it inappropriate for many areas.

Simulation techniques

A number of authors have attempted to extend records of inadequate length by generating series of synthetic data, based on the statistics of the distribution revealed by the real (but short) data sets. The procedure is based on a Markov process, generally a one-step Markov chain model. The observed wind speed data are transformed into a time series of wind speed states, each state containing wind speeds between certain threshold values. This time series is in turn organized into a transitional probability matrix (TPM), which shows the probabilities p_ij of a wind speed in state i at hour n changing to any state j at hour n + 1. This TPM, in association with a uniform random number generator, can be used to generate a synthetic time series of wind speed states. This can be converted to a time series of actual wind speeds by using a second random number generator, uniform except in the lowest and highest wind speed intervals when, for example, a shifted exponential distribution may be used. The success of the simulation is sensitive to the number of wind speed states selected, which is best decided by trial runs of the model and comparison of the results with the observed time series. The method is fully described by Kirchoff et al. (1989) and Dukes & Palutikof (1995). In testing the model results, it is common to find that the summary statistics of the original data set, for example the shape and scale parameters of the Weibull distribution, are well modelled. However, the synthetic time series will generally contain less low-frequency information than the original, such that the autocorrelation coefficients for the observed wind remain higher over a longer period than is the case for the synthetic time series (Kirchoff et al., 1989; Kaminsky et al., 1991).
This appears to have no serious effect on the estimation of extreme values. Following generation of the synthetic time series, the approach to extracting information on extreme values of wind speed varies. For example, Cheng (1991) and Cheng & Chiu (1990; 1992) analyse the series for its
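The core of the Markov chain procedure described above, estimating the TPM from a state sequence and sampling from it, can be sketched in a few lines. This is a toy illustration only; a real application would add the second, within-state random draw to convert states back to wind speeds, and would tune the number of states by trial runs as the text describes.

```python
import numpy as np

def fit_tpm(states, n_states):
    """Estimate the one-step transition probability matrix p_ij from an
    observed sequence of wind speed states (integers 0..n_states-1)."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(states[:-1], states[1:]):
        counts[i, j] += 1
    rows = counts.sum(axis=1, keepdims=True)
    rows[rows == 0] = 1.0          # avoid division by zero for unvisited states
    return counts / rows

def simulate_states(tpm, n_steps, start=0, seed=0):
    """Generate a synthetic state sequence by repeatedly sampling the row of
    the TPM corresponding to the current state."""
    rng = np.random.default_rng(seed)
    s = start
    out = [s]
    for _ in range(n_steps - 1):
        s = rng.choice(len(tpm), p=tpm[s])
        out.append(s)
    return np.array(out)
```

The synthetic state series can then be converted to speeds by drawing within each state's speed interval, uniformly except in the tail states, as noted above.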


More information

KNMI-HYDRA project. Phase report 7. Estimation of extreme return levels of wind speed: an analysis of storm maxima

KNMI-HYDRA project. Phase report 7. Estimation of extreme return levels of wind speed: an analysis of storm maxima KNMI-HYDRA project Phase report 7 Estimation of extreme return levels of wind speed: an analysis of storm maxima KNMI, May 03 Estimation of extreme return levels of wind speed: an analysis of storm maxima

More information

On the modelling of extreme droughts

On the modelling of extreme droughts Modelling and Management of Sustainable Basin-scale Water Resource Systems (Proceedings of a Boulder Symposium, July 1995). IAHS Publ. no. 231, 1995. 377 _ On the modelling of extreme droughts HENRIK MADSEN

More information

MAXIMUM WIND GUST RETURN PERIODS FOR OKLAHOMA USING THE OKLAHOMA MESONET. Andrew J. Reader Oklahoma Climatological Survey, Norman, OK. 2.

MAXIMUM WIND GUST RETURN PERIODS FOR OKLAHOMA USING THE OKLAHOMA MESONET. Andrew J. Reader Oklahoma Climatological Survey, Norman, OK. 2. J3.14 MAXIMUM WIND GUST RETURN PERIODS FOR OKLAHOMA USING THE OKLAHOMA MESONET Andrew J. Reader Oklahoma Climatological Survey, Norman, OK 1. Introduction It is well known that Oklahoma experiences convective

More information

Enhancing Weather Information with Probability Forecasts. An Information Statement of the American Meteorological Society

Enhancing Weather Information with Probability Forecasts. An Information Statement of the American Meteorological Society Enhancing Weather Information with Probability Forecasts An Information Statement of the American Meteorological Society (Adopted by AMS Council on 12 May 2008) Bull. Amer. Meteor. Soc., 89 Summary This

More information

ON THE TWO STEP THRESHOLD SELECTION FOR OVER-THRESHOLD MODELLING

ON THE TWO STEP THRESHOLD SELECTION FOR OVER-THRESHOLD MODELLING ON THE TWO STEP THRESHOLD SELECTION FOR OVER-THRESHOLD MODELLING Pietro Bernardara (1,2), Franck Mazas (3), Jérôme Weiss (1,2), Marc Andreewsky (1), Xavier Kergadallan (4), Michel Benoît (1,2), Luc Hamm

More information

The battle of extreme value distributions: A global survey on the extreme

The battle of extreme value distributions: A global survey on the extreme 1 2 The battle of extreme value distributions: A global survey on the extreme daily rainfall 3 Simon Michael Papalexiou and Demetris Koutsoyiannis 4 5 Department of Water Resources, Faculty of Civil Engineering,

More information

Colin C. Caprani. University College Dublin, Ireland

Colin C. Caprani. University College Dublin, Ireland Colin C. Caprani University College Dublin, Ireland Bending Moment (knm) 8 7 6 5 4 3 2 2 3 4 5 6 Position of First Axle (m).2 GVW - Direction..8.6 Measured Generated Modelled.4.2 2 3 4 5 Probabalistic

More information

Financial Econometrics and Volatility Models Extreme Value Theory

Financial Econometrics and Volatility Models Extreme Value Theory Financial Econometrics and Volatility Models Extreme Value Theory Eric Zivot May 3, 2010 1 Lecture Outline Modeling Maxima and Worst Cases The Generalized Extreme Value Distribution Modeling Extremes Over

More information

' International Institute for Land Reclamation and Improvement. 6.1 Introduction. 6.2 Frequency Analysis

' International Institute for Land Reclamation and Improvement. 6.1 Introduction. 6.2 Frequency Analysis 6.1 Introduction Frequency analysis, regression analysis, and screening of time series are the most common statistical methods of analyzing hydrologic data. Frequency analysis is used to predict how often

More information

Construction of confidence intervals for extreme rainfall quantiles

Construction of confidence intervals for extreme rainfall quantiles Risk Analysis VIII 93 Construction of confidence intervals for extreme rainfall quantiles A. T. Silva 1, M. M. Portela 1, J. Baez & M. Naghettini 3 1 Instituto Superior Técnico, Portugal Universidad Católica

More information

Quantifying Weather Risk Analysis

Quantifying Weather Risk Analysis Quantifying Weather Risk Analysis Now that an index has been selected and calibrated, it can be used to conduct a more thorough risk analysis. The objective of such a risk analysis is to gain a better

More information

1 Degree distributions and data

1 Degree distributions and data 1 Degree distributions and data A great deal of effort is often spent trying to identify what functional form best describes the degree distribution of a network, particularly the upper tail of that distribution.

More information

Subject Index. Block maxima, 3 Bootstrap, 45, 67, 149, 176 Box-Cox transformation, 71, 85 Brownian noise, 219

Subject Index. Block maxima, 3 Bootstrap, 45, 67, 149, 176 Box-Cox transformation, 71, 85 Brownian noise, 219 Subject Index Entries in this index are generally sorted with page number as they appear in the text. Page numbers that are marked in bold face indicate that the entry appears in a title or subheading.

More information

Spatial and temporal extremes of wildfire sizes in Portugal ( )

Spatial and temporal extremes of wildfire sizes in Portugal ( ) International Journal of Wildland Fire 2009, 18, 983 991. doi:10.1071/wf07044_ac Accessory publication Spatial and temporal extremes of wildfire sizes in Portugal (1984 2004) P. de Zea Bermudez A, J. Mendes

More information

Climate Risk Profile for Samoa

Climate Risk Profile for Samoa Climate Risk Profile for Samoa Report Prepared by Wairarapa J. Young Samoa Meteorology Division March, 27 Summary The likelihood (i.e. probability) components of climate-related risks in Samoa are evaluated

More information

Precipitation Extremes in the Hawaiian Islands and Taiwan under a changing climate

Precipitation Extremes in the Hawaiian Islands and Taiwan under a changing climate Precipitation Extremes in the Hawaiian Islands and Taiwan under a changing climate Pao-Shin Chu Department of Atmospheric Sciences University of Hawaii-Manoa Y. Ruan, X. Zhao, D.J. Chen, and P.L. Lin December

More information

Introduction to Algorithmic Trading Strategies Lecture 10

Introduction to Algorithmic Trading Strategies Lecture 10 Introduction to Algorithmic Trading Strategies Lecture 10 Risk Management Haksun Li haksun.li@numericalmethod.com www.numericalmethod.com Outline Value at Risk (VaR) Extreme Value Theory (EVT) References

More information

Investigation of an Automated Approach to Threshold Selection for Generalized Pareto

Investigation of an Automated Approach to Threshold Selection for Generalized Pareto Investigation of an Automated Approach to Threshold Selection for Generalized Pareto Kate R. Saunders Supervisors: Peter Taylor & David Karoly University of Melbourne April 8, 2015 Outline 1 Extreme Value

More information

The Spatial Variation of the Maximum Possible Pollutant Concentration from Steady Sources

The Spatial Variation of the Maximum Possible Pollutant Concentration from Steady Sources International Environmental Modelling and Software Society (iemss) 2010 International Congress on Environmental Modelling and Software Modelling for Environment s Sake, Fifth Biennial Meeting, Ottawa,

More information

PREDICTING DROUGHT VULNERABILITY IN THE MEDITERRANEAN

PREDICTING DROUGHT VULNERABILITY IN THE MEDITERRANEAN J.7 PREDICTING DROUGHT VULNERABILITY IN THE MEDITERRANEAN J. P. Palutikof and T. Holt Climatic Research Unit, University of East Anglia, Norwich, UK. INTRODUCTION Mediterranean water resources are under

More information

Descripiton of method used for wind estimates in StormGeo

Descripiton of method used for wind estimates in StormGeo Descripiton of method used for wind estimates in StormGeo Photo taken from http://juzzi-juzzi.deviantart.com/art/kite-169538553 StormGeo, 2012.10.16 Introduction The wind studies made by StormGeo for HybridTech

More information

The Goodness-of-fit Test for Gumbel Distribution: A Comparative Study

The Goodness-of-fit Test for Gumbel Distribution: A Comparative Study MATEMATIKA, 2012, Volume 28, Number 1, 35 48 c Department of Mathematics, UTM. The Goodness-of-fit Test for Gumbel Distribution: A Comparative Study 1 Nahdiya Zainal Abidin, 2 Mohd Bakri Adam and 3 Habshah

More information

The Australian Operational Daily Rain Gauge Analysis

The Australian Operational Daily Rain Gauge Analysis The Australian Operational Daily Rain Gauge Analysis Beth Ebert and Gary Weymouth Bureau of Meteorology Research Centre, Melbourne, Australia e.ebert@bom.gov.au Daily rainfall data and analysis procedure

More information

Reprinted from MONTHLY WEATHER REVIEW, Vol. 109, No. 12, December 1981 American Meteorological Society Printed in I'. S. A.

Reprinted from MONTHLY WEATHER REVIEW, Vol. 109, No. 12, December 1981 American Meteorological Society Printed in I'. S. A. Reprinted from MONTHLY WEATHER REVIEW, Vol. 109, No. 12, December 1981 American Meteorological Society Printed in I'. S. A. Fitting Daily Precipitation Amounts Using the S B Distribution LLOYD W. SWIFT,

More information

On the Application of the Generalized Pareto Distribution for Statistical Extrapolation in the Assessment of Dynamic Stability in Irregular Waves

On the Application of the Generalized Pareto Distribution for Statistical Extrapolation in the Assessment of Dynamic Stability in Irregular Waves On the Application of the Generalized Pareto Distribution for Statistical Extrapolation in the Assessment of Dynamic Stability in Irregular Waves Bradley Campbell 1, Vadim Belenky 1, Vladas Pipiras 2 1.

More information

Overview of Extreme Value Theory. Dr. Sawsan Hilal space

Overview of Extreme Value Theory. Dr. Sawsan Hilal space Overview of Extreme Value Theory Dr. Sawsan Hilal space Maths Department - University of Bahrain space November 2010 Outline Part-1: Univariate Extremes Motivation Threshold Exceedances Part-2: Bivariate

More information

Wei-han Liu Department of Banking and Finance Tamkang University. R/Finance 2009 Conference 1

Wei-han Liu Department of Banking and Finance Tamkang University. R/Finance 2009 Conference 1 Detecting Structural Breaks in Tail Behavior -From the Perspective of Fitting the Generalized Pareto Distribution Wei-han Liu Department of Banking and Finance Tamkang University R/Finance 2009 Conference

More information

On the occurrence times of componentwise maxima and bias in likelihood inference for multivariate max-stable distributions

On the occurrence times of componentwise maxima and bias in likelihood inference for multivariate max-stable distributions On the occurrence times of componentwise maxima and bias in likelihood inference for multivariate max-stable distributions J. L. Wadsworth Department of Mathematics and Statistics, Fylde College, Lancaster

More information

High-frequency data modelling using Hawkes processes

High-frequency data modelling using Hawkes processes Valérie Chavez-Demoulin joint work with High-frequency A.C. Davison data modelling and using A.J. Hawkes McNeil processes(2005), J.A EVT2013 McGill 1 /(201 High-frequency data modelling using Hawkes processes

More information

APPLICATION OF EXTREMAL THEORY TO THE PRECIPITATION SERIES IN NORTHERN MORAVIA

APPLICATION OF EXTREMAL THEORY TO THE PRECIPITATION SERIES IN NORTHERN MORAVIA APPLICATION OF EXTREMAL THEORY TO THE PRECIPITATION SERIES IN NORTHERN MORAVIA DANIELA JARUŠKOVÁ Department of Mathematics, Czech Technical University, Prague; jarus@mat.fsv.cvut.cz 1. Introduction The

More information

Extreme Rain all Frequency Analysis for Louisiana

Extreme Rain all Frequency Analysis for Louisiana 78 TRANSPORTATION RESEARCH RECORD 1420 Extreme Rain all Frequency Analysis for Louisiana BABAK NAGHAVI AND FANG XIN Yu A comparative study of five popular frequency distributions and three parameter estimation

More information

Physics and Chemistry of the Earth

Physics and Chemistry of the Earth Physics and Chemistry of the Earth 34 (2009) 626 634 Contents lists available at ScienceDirect Physics and Chemistry of the Earth journal homepage: www.elsevier.com/locate/pce Performances of some parameter

More information

Hydrologic Design under Nonstationarity

Hydrologic Design under Nonstationarity Hydrologic Design under Nonstationarity Jayantha Obeysekera ( Obey ), SFWMD Jose D. Salas, Colorado State University Hydroclimatology & Engineering Adaptaton (HYDEA) Subcommittee Meeting May 30, 2017,

More information

High-frequency data modelling using Hawkes processes

High-frequency data modelling using Hawkes processes High-frequency data modelling using Hawkes processes Valérie Chavez-Demoulin 1 joint work J.A McGill 1 Faculty of Business and Economics, University of Lausanne, Switzerland Boulder, April 2016 Boulder,

More information

FOWPI Metocean Workshop Modelling, Design Parameters and Weather Windows

FOWPI Metocean Workshop Modelling, Design Parameters and Weather Windows FOWPI Metocean Workshop Modelling, Design Parameters and Weather Windows Jesper Skourup, Chief Specialist, COWI 1 The Project is funded by The European Union Agenda 1. Metocean Data Requirements 2. Site

More information

MULTIDIMENSIONAL COVARIATE EFFECTS IN SPATIAL AND JOINT EXTREMES

MULTIDIMENSIONAL COVARIATE EFFECTS IN SPATIAL AND JOINT EXTREMES MULTIDIMENSIONAL COVARIATE EFFECTS IN SPATIAL AND JOINT EXTREMES Philip Jonathan, Kevin Ewans, David Randell, Yanyun Wu philip.jonathan@shell.com www.lancs.ac.uk/ jonathan Wave Hindcasting & Forecasting

More information

Appendix 1: UK climate projections

Appendix 1: UK climate projections Appendix 1: UK climate projections The UK Climate Projections 2009 provide the most up-to-date estimates of how the climate may change over the next 100 years. They are an invaluable source of information

More information

11B.1 OPTIMAL APPLICATION OF CLIMATE DATA TO THE DEVELOPMENT OF DESIGN WIND SPEEDS

11B.1 OPTIMAL APPLICATION OF CLIMATE DATA TO THE DEVELOPMENT OF DESIGN WIND SPEEDS 11B.1 OPTIMAL APPLICATION OF CLIMATE DATA TO THE DEVELOPMENT OF DESIGN WIND SPEEDS Andries C. Kruger * South African Weather Service, Pretoria, South Africa Johan V. Retief University of Stellenbosch,

More information

Modelling residual wind farm variability using HMMs

Modelling residual wind farm variability using HMMs 8 th World IMACS/MODSIM Congress, Cairns, Australia 3-7 July 2009 http://mssanz.org.au/modsim09 Modelling residual wind farm variability using HMMs Ward, K., Korolkiewicz, M. and Boland, J. School of Mathematics

More information

INCORPORATION OF WEIBULL DISTRIBUTION IN L-MOMENTS METHOD FOR REGIONAL FREQUENCY ANALYSIS OF PEAKS-OVER-THRESHOLD WAVE HEIGHTS

INCORPORATION OF WEIBULL DISTRIBUTION IN L-MOMENTS METHOD FOR REGIONAL FREQUENCY ANALYSIS OF PEAKS-OVER-THRESHOLD WAVE HEIGHTS INCORPORATION OF WEIBULL DISTRIBUTION IN L-MOMENTS METHOD FOR REGIONAL FREQUENCY ANALYSIS OF PEAKS-OVER-THRESHOLD WAVE HEIGHTS Yoshimi Goda, Masanobu Kudaa, and Hiroyasu Kawai The L-moments of the distribution

More information

New Intensity-Frequency- Duration (IFD) Design Rainfalls Estimates

New Intensity-Frequency- Duration (IFD) Design Rainfalls Estimates New Intensity-Frequency- Duration (IFD) Design Rainfalls Estimates Janice Green Bureau of Meteorology 17 April 2013 Current IFDs AR&R87 Current IFDs AR&R87 Current IFDs - AR&R87 Options for estimating

More information

Practice Problems Section Problems

Practice Problems Section Problems Practice Problems Section 4-4-3 4-4 4-5 4-6 4-7 4-8 4-10 Supplemental Problems 4-1 to 4-9 4-13, 14, 15, 17, 19, 0 4-3, 34, 36, 38 4-47, 49, 5, 54, 55 4-59, 60, 63 4-66, 68, 69, 70, 74 4-79, 81, 84 4-85,

More information

Large-scale Indicators for Severe Weather

Large-scale Indicators for Severe Weather Large-scale Indicators for Severe Weather Eric Gilleland Matthew Pocernich Harold E. Brooks Barbara G. Brown Patrick Marsh Abstract Trends in extreme values of a large-scale indicator for severe weather

More information

TREND AND VARIABILITY ANALYSIS OF RAINFALL SERIES AND THEIR EXTREME

TREND AND VARIABILITY ANALYSIS OF RAINFALL SERIES AND THEIR EXTREME TREND AND VARIABILITY ANALYSIS OF RAINFALL SERIES AND THEIR EXTREME EVENTS J. Abaurrea, A. C. Cebrián. Dpto. Métodos Estadísticos. Universidad de Zaragoza. Abstract: Rainfall series and their corresponding

More information

1 Introduction. 2 AIC versus SBIC. Erik Swanson Cori Saviano Li Zha Final Project

1 Introduction. 2 AIC versus SBIC. Erik Swanson Cori Saviano Li Zha Final Project Erik Swanson Cori Saviano Li Zha Final Project 1 Introduction In analyzing time series data, we are posed with the question of how past events influences the current situation. In order to determine this,

More information

401 Review. 6. Power analysis for one/two-sample hypothesis tests and for correlation analysis.

401 Review. 6. Power analysis for one/two-sample hypothesis tests and for correlation analysis. 401 Review Major topics of the course 1. Univariate analysis 2. Bivariate analysis 3. Simple linear regression 4. Linear algebra 5. Multiple regression analysis Major analysis methods 1. Graphical analysis

More information

Systematic errors and time dependence in rainfall annual maxima statistics in Lombardy

Systematic errors and time dependence in rainfall annual maxima statistics in Lombardy Systematic errors and time dependence in rainfall annual maxima statistics in Lombardy F. Uboldi (1), A. N. Sulis (2), M. Cislaghi (2), C. Lussana (2,3), and M. Russo (2) (1) consultant, Milano, Italy

More information

Payer, Küchenhoff: Modelling extreme wind speeds in the context of risk analysis for high speed trains

Payer, Küchenhoff: Modelling extreme wind speeds in the context of risk analysis for high speed trains Payer, Küchenhoff: Modelling extreme wind speeds in the context of risk analysis for high speed trains Sonderforschungsbereich 386, Paper 295 (2002) Online unter: http://epub.ub.uni-muenchen.de/ Projektpartner

More information

Probability Methods in Civil Engineering Prof. Dr. Rajib Maity Department of Civil Engineering Indian Institute of Technology, Kharagpur

Probability Methods in Civil Engineering Prof. Dr. Rajib Maity Department of Civil Engineering Indian Institute of Technology, Kharagpur Probability Methods in Civil Engineering Prof. Dr. Rajib Maity Department of Civil Engineering Indian Institute of Technology, Kharagpur Lecture No. # 33 Probability Models using Gamma and Extreme Value

More information

Abstract: In this short note, I comment on the research of Pisarenko et al. (2014) regarding the

Abstract: In this short note, I comment on the research of Pisarenko et al. (2014) regarding the Comment on Pisarenko et al. Characterization of the Tail of the Distribution of Earthquake Magnitudes by Combining the GEV and GPD Descriptions of Extreme Value Theory Mathias Raschke Institution: freelancer

More information

PAijpam.eu ADAPTIVE K-S TESTS FOR WHITE NOISE IN THE FREQUENCY DOMAIN Hossein Arsham University of Baltimore Baltimore, MD 21201, USA

PAijpam.eu ADAPTIVE K-S TESTS FOR WHITE NOISE IN THE FREQUENCY DOMAIN Hossein Arsham University of Baltimore Baltimore, MD 21201, USA International Journal of Pure and Applied Mathematics Volume 82 No. 4 2013, 521-529 ISSN: 1311-8080 (printed version); ISSN: 1314-3395 (on-line version) url: http://www.ijpam.eu doi: http://dx.doi.org/10.12732/ijpam.v82i4.2

More information

Research Article A Nonparametric Two-Sample Wald Test of Equality of Variances

Research Article A Nonparametric Two-Sample Wald Test of Equality of Variances Advances in Decision Sciences Volume 211, Article ID 74858, 8 pages doi:1.1155/211/74858 Research Article A Nonparametric Two-Sample Wald Test of Equality of Variances David Allingham 1 andj.c.w.rayner

More information

Battle of extreme value distributions: A global survey on extreme daily rainfall

Battle of extreme value distributions: A global survey on extreme daily rainfall WATER RESOURCES RESEARCH, VOL. 49, 187 201, doi:10.1029/2012wr012557, 2013 Battle of extreme value distributions: A global survey on extreme daily rainfall Simon Michael Papalexiou, 1 and Demetris Koutsoyiannis

More information

Extreme Precipitation Analysis at Hinkley Point Final Report

Extreme Precipitation Analysis at Hinkley Point Final Report Extreme Precipitation Analysis at Hinkley Point Final Report For - EDF Date 20 th August 2010 Authors Thomas Francis, Michael Sanderson, James Dent and Mathew Perry FinalReport_1and2_v6.2-1 Crown copyright

More information

Modeling and Performance Analysis with Discrete-Event Simulation

Modeling and Performance Analysis with Discrete-Event Simulation Simulation Modeling and Performance Analysis with Discrete-Event Simulation Chapter 9 Input Modeling Contents Data Collection Identifying the Distribution with Data Parameter Estimation Goodness-of-Fit

More information

C4-304 STATISTICS OF LIGHTNING OCCURRENCE AND LIGHTNING CURRENT S PARAMETERS OBTAINED THROUGH LIGHTNING LOCATION SYSTEMS

C4-304 STATISTICS OF LIGHTNING OCCURRENCE AND LIGHTNING CURRENT S PARAMETERS OBTAINED THROUGH LIGHTNING LOCATION SYSTEMS 2, rue d'artois, F-75008 Paris http://www.cigre.org C4-304 Session 2004 CIGRÉ STATISTICS OF LIGHTNING OCCURRENCE AND LIGHTNING CURRENT S PARAMETERS OBTAINED THROUGH LIGHTNING LOCATION SYSTEMS baran@el.poweng.pub.ro

More information

Bayesian Point Process Modeling for Extreme Value Analysis, with an Application to Systemic Risk Assessment in Correlated Financial Markets

Bayesian Point Process Modeling for Extreme Value Analysis, with an Application to Systemic Risk Assessment in Correlated Financial Markets Bayesian Point Process Modeling for Extreme Value Analysis, with an Application to Systemic Risk Assessment in Correlated Financial Markets Athanasios Kottas Department of Applied Mathematics and Statistics,

More information

Computer Science, Informatik 4 Communication and Distributed Systems. Simulation. Discrete-Event System Simulation. Dr.

Computer Science, Informatik 4 Communication and Distributed Systems. Simulation. Discrete-Event System Simulation. Dr. Simulation Discrete-Event System Simulation Chapter 8 Input Modeling Purpose & Overview Input models provide the driving force for a simulation model. The quality of the output is no better than the quality

More information

HANDBOOK OF APPLICABLE MATHEMATICS

HANDBOOK OF APPLICABLE MATHEMATICS HANDBOOK OF APPLICABLE MATHEMATICS Chief Editor: Walter Ledermann Volume VI: Statistics PART A Edited by Emlyn Lloyd University of Lancaster A Wiley-Interscience Publication JOHN WILEY & SONS Chichester

More information

MFM Practitioner Module: Quantitiative Risk Management. John Dodson. October 14, 2015

MFM Practitioner Module: Quantitiative Risk Management. John Dodson. October 14, 2015 MFM Practitioner Module: Quantitiative Risk Management October 14, 2015 The n-block maxima 1 is a random variable defined as M n max (X 1,..., X n ) for i.i.d. random variables X i with distribution function

More information