International Conference on Applied Probability and Computational Methods in Applied Sciences

Organizing Committee
Zhiliang Ying, Columbia University and Shanghai Center for Mathematical Sciences
Jingchen Liu, Columbia University
Yunxiao Chen, Columbia University
Xiaoou Li, Columbia University

Sponsor
Shanghai Center for Mathematical Sciences

Shanghai, China, November 2-3, 2015

SCHEDULE

Monday, Nov 2
Morning Session I (Room 2201, Guanghua East Building, Fudan University)
8:30-9:00   Peter Glynn, Stanford University, USA
9:05-9:35   Sandeep Juneja, Tata Institute of Fundamental Research, India
9:40-10:20  Coffee Break
Morning Session II
10:20-10:50 Jürg Hüsler, University of Bern, Switzerland
10:55-11:25 Gongjun Xu, University of Minnesota, USA
Lunch
Afternoon Session I
2:00-2:30   Richard Davis, Columbia University, USA
2:35-3:05   Thomas Mikosch, University of Copenhagen, Denmark
3:10-3:50   Coffee Break
Afternoon Session II
3:50-4:20   Henrik Hult, KTH Royal Institute of Technology, Sweden
4:25-4:55   Jinglai Li, Shanghai Jiaotong University, China
Dinner

Tuesday, Nov 3
Morning Session I (Room 2201, Guanghua East Building, Fudan University)
9:30-10:00  Yimin Xiao, Michigan State University, USA
10:05-10:35 Gennady Samorodnitsky, Cornell University, USA
10:40-11:00 Coffee Break
Morning Session II
11:00-11:30 Jingchen Liu, Columbia University, USA
Lunch

Abstracts

Richard Davis
On Central Limit Theorems for Weakly Dependent Random Fields with Applications
In this talk, I will describe central limit theorems for sums of observations from a κ-weakly dependent random field. Specifically, observations taken from a random field at irregularly spaced and possibly random locations in the domain of the random field are considered. The sums of these samples, as well as sums of functions of pairs of the observations, are often objects of interest for modeling spatial data. For example, if the underlying random field is strictly stationary, then one may be interested in estimating the mean and covariance functions by their sample counterparts. Central limit theorems for these quantities are established under quite general dependence conditions and various sampling scenarios for the locations of the observations. An illustration of these results for inference problems involving the stochastic heteroscedastic process (SHP) will be given. (This is joint work with Jingchen Liu and Xuan Yang.)

Peter Glynn
High Volume Traffic Modeling at Mesoscopic Time Scales
Many applications call for stochastic models that appropriately describe incoming traffic to a congested resource (e.g. popular websites, call centers, server farms, etc.). In this talk, we discuss some of the limitations of conventional Poisson modeling, and point out what we call the Poisson breakdown phenomenon. In particular, real-world high-volume traffic often exhibits medium time-scale (mesoscopic-scale) correlations that are difficult to detect at the microscopic time scale of individual inter-arrivals. In view of this challenge, we discuss a class of Poisson autoregressive models intended to model traffic at such mesoscopic scales. These models are probabilistically very tractable and also lend themselves to straightforward calibration. (This work is joint with Xiaowei Zhang.)

Henrik Hult
Efficient importance sampling for computing credit value adjustment of interest rate portfolios
Prior to the financial crisis in 2008, the credit risk in derivatives was not appropriately accounted for. Since then there has been an influx of value adjustments to derivative pricing, collected under the name of XVA. The counterparty credit risk is this: a bank B trading derivatives with a counterparty C is subject to the default risk of C. If C defaults at time T and the value of the derivative at that time is V(T) > 0 for the bank, then most likely B will not collect the full value of the derivative. The loss is called Loss Given Default (LGD). On the other hand, if the value is negative, then the defaulting counterparty C will terminate the contract and use the money in the default. Therefore, at default, there is an asymmetry in the value, and the loss is LGD · max(V(T), 0). The credit value adjustment (CVA) is the expected loss from the counterparty defaulting before maturity, taking netting effects into account. In this talk I will show how to construct efficient importance sampling algorithms for computing the CVA of a portfolio of interest rate derivatives when the short rate follows a simple one-factor Hull-White model.

Jürg Hüsler
Massive Excursions of Trajectories of Gaussian Fields
We consider the large excursions of a homogeneous Gaussian random field X(t), t ∈ R², with mean zero and unit variance, in a bounded set T. Of main interest are excursions at two time points t, s that are separated by a given ε > 0. The excursion depends on the correlation function r(t). We describe the asymptotic probability of such excursions and the path behaviour between the two time points t, s of exceedances. (This is joint work with Vladimir Piterbarg.)

Sandeep Juneja
Large deviations, selecting the best population and multi-armed bandit methods
Consider the problem of finding a population amongst many with the smallest mean when these means are unknown but population samples can be generated. Typically, by selecting a population with the smallest sample mean, it can be shown that the false selection probability decays at an exponential rate. Lately researchers have sought algorithms that guarantee that this probability is restricted to a small δ in order log(1/δ) computational time by estimating the associated large deviations rate function via simulation. We show that such guarantees are misleading. En route, we identify the large deviations principle followed by the empirically estimated large deviations rate function, which may also be of independent interest. Further, we show a negative result: when populations have unbounded support, any policy that asymptotically identifies the correct population with probability at least 1 − δ for each problem instance requires more than O(log(1/δ)) samples in making such a determination in any problem instance. This suggests that some restrictions on the populations are essential to devise O(log(1/δ)) algorithms with 1 − δ correctness guarantees. We note that under restrictions on population moments, such methods are easily designed. Further, under similar restrictions, sequential methods from the multi-armed bandit literature can also be adapted to devise such algorithms.
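As a concrete illustration of the baseline behaviour described in Sandeep Juneja's abstract, the following minimal Python sketch draws samples from a few populations, selects the one with the smallest sample mean, and estimates the false selection probability by repetition. The Gaussian populations, means, and sample sizes are purely illustrative assumptions; this is not the δ-correct machinery analysed in the talk.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumption: three Gaussian populations; population 0 has the smallest mean.
means = np.array([0.0, 0.2, 0.4])
n_samples = 50          # samples drawn from each population
n_repetitions = 10000   # independent repetitions used to estimate the false selection probability

false_selections = 0
for _ in range(n_repetitions):
    # Sample means of the three populations, each based on n_samples draws.
    sample_means = rng.normal(loc=means, scale=1.0, size=(n_samples, 3)).mean(axis=0)
    if sample_means.argmin() != 0:  # the selected population is not the true best
        false_selections += 1

print("estimated false selection probability:", false_selections / n_repetitions)

Increasing n_samples makes the estimated probability decay roughly exponentially in the sample size, which is the exponential rate the abstract takes as its starting point.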

Jinglai Li
Adaptive Independence Sampler MCMC for Infinite Dimensional Bayesian Inferences
Many scientific and engineering problems require performing Bayesian inference in function spaces, in which the unknowns are infinite dimensional. In such problems, many standard Markov chain Monte Carlo (MCMC) algorithms become arbitrarily slow under mesh refinement, which is referred to as being dimension dependent. In this work we develop an independence-sampler-based MCMC method for infinite dimensional Bayesian inference. We represent the proposal distribution as a mixture of a finite number of specially parametrized Gaussian measures. We show that, under the chosen parametrization, the resulting MCMC algorithm is dimension independent. We also design an efficient adaptive algorithm to adjust the parameter values of the mixtures based on the previous samples. Finally, we provide numerical examples to demonstrate the efficiency and robustness of the proposed method, even for problems with multimodal posterior distributions.

Jingchen Liu
On the Level Crossing of Likelihood Functions
Level crossing has been a classic topic. Most studies in the literature focus on Gaussian processes and, more generally, diffusion processes that often serve as the limiting objects of discrete processes. In this talk, we consider a generic parametric family of models and the level crossing of the likelihood function indexed by both the unknown parameter and the sample size. This study has implications for the asymptotic efficiency of several classic hypothesis testing procedures and for the large deviations of the maximum likelihood estimator.

Thomas Mikosch
Nagaev-type large deviations for heavy-tailed processes
In a series of papers in the 1960s and 1970s, A.V. and S.V. Nagaev proved precise large deviation results for sums of iid random variables with regularly varying tails. In 1995 the notion of a regularly varying stationary sequence was introduced in a paper by Richard A. Davis and Tailen Hsing (Ann. Probab.). This means that the finite-dimensional distributions of such a sequence have regularly varying tails. Examples are linear processes with regularly varying noise variables, GARCH processes, and affine stochastic recursions considered by Kesten (Acta Math., 1973). We present large deviation results for regularly varying stationary sequences and indicate how they can be applied to prove limit theory for sums, point processes and maxima. (This is joint work with Olivier Wintenberger.)
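For orientation only (not part of the abstract above), the classical Nagaev-type result that Thomas Mikosch's talk extends to stationary sequences can be stated roughly as follows for an iid sequence X, X_1, X_2, ... with regularly varying right tail of index α > 1 and partial sums S_n = X_1 + ... + X_n:

P(S_n − E S_n > x) / (n P(X > x)) → 1 as n → ∞, uniformly for x ≥ x_n,

where x_n → ∞ is a suitable threshold sequence whose growth depends on α (for instance, x_n proportional to √(n log n) when α > 2). Informally, the large deviation event is dominated by a single extreme summand.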

Gennady Samorodnitsky
Time-changed Extremal Process as a Random Sup Measure
A functional limit theorem for the partial maxima of a long memory stable sequence produces a limiting process that can be described as a β-power time change in the classical Fréchet extremal process, for β in a subinterval of the unit interval. Any such power time change in the extremal process for 0 < β < 1 produces a process with stationary max-increments. This deceptively simple time change hides the much more delicate structure of the resulting process as a self-affine random sup measure. We uncover this structure and show that in a certain range of the parameters this random measure arises as a limit of the partial maxima of the same long memory stable sequence, but in a different space. These results open a way to construct a whole new class of self-similar Fréchet processes with stationary max-increments.

Yimin Xiao
Estimation of Fractal Indices for Bivariate Gaussian Random Fields
This talk is concerned with the estimation of fractal indices of multivariate Gaussian random fields (or spatial processes). In the multivariate setting, it is important and challenging to quantify the effect of the cross-dependence structure on the joint performance of the estimators. By extending the increment-based method (Chan and Wood, 2000, 2004; Anderes, 2010), we construct estimators for the fractal indices of a large class of locally stationary bivariate Gaussian random fields and investigate their joint statistical performance. We provide conditions on the cross covariance for the estimators to be asymptotically independent. The main results are applicable to bivariate stationary Gaussian random fields with Matérn cross covariance functions, introduced by Kleiber, Gneiting and Schlather (JASA, 2010). (This talk is based on joint work with Yuzhen Zhou.)

Gongjun Xu
Rare-event Analysis for Extremal Eigenvalues of the Beta-Laguerre Ensemble
In this talk we consider the extreme behavior of the extremal eigenvalues coming from the beta-Laguerre ensemble, which is a generalization of the Wishart matrix and plays an important role in multivariate analysis. In particular, we focus on the case when the feature dimension p is much larger than or comparable to the number of observations n, a common situation in modern data analysis. We provide asymptotic approximations and bounds for the tail probabilities of the extremal eigenvalues. Moreover, we construct efficient Monte Carlo simulation algorithms to compute the tail probabilities. Simulation results show that our method has the best performance amongst known approximation approaches, and furthermore provides an efficient and accurate way to evaluate the tail probabilities in practice.
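As a point of reference for Gongjun Xu's abstract, the tail probabilities in question can be approximated by crude Monte Carlo in the classical Wishart special case (β = 1). The Python sketch below is only a naive baseline under illustrative choices of n, p, and the tail level, not the efficient rare-event algorithms of the talk.

import numpy as np

rng = np.random.default_rng(1)

# Illustrative assumptions: p features, n observations with p comparable to n, and an arbitrary tail level.
n, p = 100, 200
threshold = 600.0        # tail level for the largest eigenvalue of X^T X (illustrative)
n_replications = 2000

exceedances = 0
for _ in range(n_replications):
    X = rng.standard_normal((n, p))
    # The nonzero eigenvalues of the Wishart matrix X^T X (p x p) agree with those of
    # X X^T (n x n), so we diagonalize the smaller matrix.
    lam_max = np.linalg.eigvalsh(X @ X.T).max()
    exceedances += int(lam_max > threshold)

print("crude Monte Carlo estimate of P(lambda_max > threshold):", exceedances / n_replications)

For genuinely rare levels the crude estimator requires a prohibitive number of replications, which is precisely what motivates the efficient simulation algorithms described in the abstract.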