Some illustrations of possibilistic correlation


Robert Fullér
IAMSR, Åbo Akademi University, Joukahaisenkatu 3-5 A, FIN-20520 Turku
e-mail: rfuller@abo.fi

József Mezei
Turku Centre for Computer Science, Joukahaisenkatu 3-5 B, FIN-20520 Turku
e-mail: jmezei@abo.fi

Péter Várlaki
Budapest University of Technology and Economics, Bertalan L. u. 2, H-1111 Budapest, Hungary,
and Széchenyi István University, Egyetem tér 1, H-9026 Győr, Hungary
e-mail: varlaki@kme.bme.hu

Abstract: In this paper we will show some examples of possibilistic correlation. In particular, we will show (i) how the possibilistic correlation coefficient of two linear marginal possibility distributions changes from zero to -1/2, and from -1/2 to -3/5, by taking out bigger and bigger parts from the level sets of their joint possibility distribution; and (ii) how to compute the autocorrelation coefficient of a fuzzy time series with linear fuzzy data.

1 Introduction

A fuzzy number A is a fuzzy set of R with a normal, fuzzy convex and continuous membership function of bounded support. The family of fuzzy numbers is denoted by F. Fuzzy numbers can be considered as possibility distributions. A fuzzy set C in R² is said to be a joint possibility distribution of fuzzy numbers A, B ∈ F if it satisfies the relationships max_x {C(x, y)} = B(y) and max_y {C(x, y)} = A(x) for all x, y ∈ R. Furthermore, A and B are called

the marginal possibility distributions of C.

Let A ∈ F be a fuzzy number with γ-level sets denoted by [A]^γ = [a_1(γ), a_2(γ)], γ ∈ [0, 1], and let U_γ denote a uniform probability distribution on [A]^γ, γ ∈ [0, 1].

In possibility theory we can use the principle of expected value of functions on fuzzy sets to define the variance, covariance and correlation of possibility distributions. Namely, we equip each level set of a possibility distribution (represented by a fuzzy number) with a uniform probability distribution, then apply the standard probabilistic calculations, and then define measures on possibility distributions by integrating these weighted probabilistic notions over the set of all membership grades. The weights (or importances) can be given by weighting functions. A function f: [0, 1] → R is said to be a weighting function if f is non-negative, monotone increasing and satisfies the normalization condition ∫_0^1 f(γ) dγ = 1. Different weighting functions can give different (case-dependent) importances to the level sets of possibility distributions.

In 2009 we introduced a new definition of the possibilistic correlation coefficient (see [2]) that improves the earlier definition introduced by Carlsson, Fullér and Majlender in 2005 (see [1]).

Definition ([2]). The f-weighted possibilistic correlation coefficient of A, B ∈ F (with respect to their joint distribution C) is defined by

    ρ_f(A, B) = ∫_0^1 ρ(X_γ, Y_γ) f(γ) dγ,    (1)

where

    ρ(X_γ, Y_γ) = cov(X_γ, Y_γ) / (√var(X_γ) · √var(Y_γ)),

and where X_γ and Y_γ are random variables whose joint distribution is uniform on [C]^γ for all γ ∈ [0, 1], and cov(X_γ, Y_γ) denotes their probabilistic covariance. In other words, the f-weighted possibilistic correlation coefficient is nothing else but the f-weighted average of the probabilistic correlation coefficients ρ(X_γ, Y_γ) for all γ ∈ [0, 1].
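As an illustration of Definition (1) only, the integral in (1) can be approximated by a simple quadrature once the level-wise correlations γ ↦ ρ(X_γ, Y_γ) are known. The following minimal sketch is not part of the paper; the function names and the weighting function f(γ) = 2γ are our own illustrative choices, and NumPy is assumed to be available.

    # Minimal numerical sketch of Definition (1): rho_f(A, B) is the f-weighted
    # average of the level-wise probabilistic correlations rho(X_gamma, Y_gamma).
    # All names below are illustrative, not taken from the paper.
    import numpy as np

    def possibilistic_correlation(rho_of_gamma, weight=lambda g: 2.0 * g, num=2001):
        """Approximate rho_f(A, B) = int_0^1 rho(X_gamma, Y_gamma) f(gamma) dgamma (trapezoidal rule)."""
        gammas = np.linspace(0.0, 1.0, num)
        values = np.array([rho_of_gamma(g) * weight(g) for g in gammas])
        return float(np.sum(0.5 * (values[1:] + values[:-1]) * np.diff(gammas)))

    # When rho(X_gamma, Y_gamma) is identically -1/2, as in the examples below,
    # rho_f(A, B) = -1/2 for every weighting function.
    print(possibilistic_correlation(lambda g: -0.5))  # prints a value close to -0.5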

Consider the case when A(x) = B(x) = (1 - x)·χ_[0,1](x) for x ∈ R, that is, [A]^γ = [B]^γ = [0, 1 - γ] for γ ∈ [0, 1]. Suppose that their joint possibility distribution is given by F(x, y) = (1 - x - y)·χ_T(x, y), where

    T = {(x, y) ∈ R² | x ≥ 0, y ≥ 0, x + y ≤ 1}.

Then we have

    [F]^γ = {(x, y) ∈ R² | x ≥ 0, y ≥ 0, x + y ≤ 1 - γ}.

Figure 1: Illustration of the joint possibility distribution F.

This situation is depicted in Fig. 1, where we have shifted the fuzzy sets to get a better view of the situation. In this case the f-weighted possibilistic correlation of A and B is computed as (see [2] for details)

    ρ_f(A, B) = ∫_0^1 (-1/2) f(γ) dγ = -1/2.

Consider now the case when A(x) = B(x) = x·χ_[0,1](x) for x ∈ R, that is, [A]^γ = [B]^γ = [γ, 1] for γ ∈ [0, 1]. Suppose that their joint possibility distribution is given by W(x, y) = max{x + y - 1, 0}. Then we get

    ρ_f(A, B) = ∫_0^1 (-1/2) f(γ) dγ = -1/2.

We note here that W is nothing else but the Łukasiewicz t-norm; in the statistical literature, W is generally referred to as the lower Fréchet-Hoeffding bound for copulas.
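The first example can be checked numerically with a small Monte Carlo sketch (not part of the derivation above): uniform samples on the triangular γ-cuts of F should show a correlation close to -1/2 for every γ. The sampler below assumes NumPy and uses the standard square-folding trick for uniform sampling on a triangle; its name is illustrative.

    # Monte Carlo sanity check: rho(X_gamma, Y_gamma) is about -1/2 for every gamma.
    import numpy as np

    def sample_triangle(gamma, size, rng):
        """Uniform samples from the gamma-cut {x, y >= 0, x + y <= 1 - gamma} of F."""
        u = rng.random((size, 2))
        u = np.where(u.sum(axis=1, keepdims=True) > 1.0, 1.0 - u, u)  # fold the unit square onto the triangle
        return (1.0 - gamma) * u

    rng = np.random.default_rng(0)
    for gamma in (0.0, 0.3, 0.7):
        pts = sample_triangle(gamma, 200_000, rng)
        print(gamma, np.corrcoef(pts[:, 0], pts[:, 1])[0, 1])  # each value is close to -0.5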

2 A transition from zero to -1/2

Suppose that a family of joint possibility distributions of A and B (where A(x) = B(x) = (1 - x)·χ_[0,1](x) for x ∈ R) is defined, for n ≥ 1, by

    C_n(x, y) = 1 - x - ((n-1)/n)·y,   if 0 ≤ y ≤ x ≤ 1 and x + ((n-1)/n)·y ≤ 1,
                1 - ((n-1)/n)·x - y,   if 0 ≤ x ≤ y ≤ 1 and ((n-1)/n)·x + y ≤ 1,
                0,                     otherwise.

(Note that C_1(x, y) = min{A(x), B(y)}, while C_n(x, y) → (1 - x - y)·χ_T(x, y) = F(x, y) as n → ∞, so the family interpolates between the non-interactive case and the joint distribution F of the Introduction.) In the following, for simplicity, we will write C instead of C_n. A γ-level set of C is computed by

    [C]^γ = {(x, y) ∈ R² | 0 ≤ x ≤ n(1-γ)/(2n-1), 0 ≤ y ≤ 1 - γ - ((n-1)/n)·x}
          ∪ {(x, y) ∈ R² | n(1-γ)/(2n-1) ≤ x ≤ 1 - γ, 0 ≤ y ≤ (n/(n-1))·(1 - γ - x)}.

The density function of a uniform distribution on [C]^γ can be written as

    f(x, y) = 1 / ∫∫_{[C]^γ} dx dy = (2n-1)/(n(1-γ)²),   if (x, y) ∈ [C]^γ,
              0,                                          otherwise.

We can calculate the marginal density functions:

    f_1(x) = (2n-1)(1 - γ - ((n-1)/n)·x)/(n(1-γ)²),   if 0 ≤ x ≤ n(1-γ)/(2n-1),
             (2n-1)(1 - γ - x)/((n-1)(1-γ)²),          if n(1-γ)/(2n-1) ≤ x ≤ 1 - γ,

and

    f_2(y) = (2n-1)(1 - γ - ((n-1)/n)·y)/(n(1-γ)²),   if 0 ≤ y ≤ n(1-γ)/(2n-1),
             (2n-1)(1 - γ - y)/((n-1)(1-γ)²),          if n(1-γ)/(2n-1) ≤ y ≤ 1 - γ.
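As a quick numerical sanity check on the level-set description just given (a sketch assuming NumPy; the helper name in_level_set is our own), the Monte Carlo area of [C_n]^γ can be compared with n(1-γ)²/(2n-1), the reciprocal of the uniform density stated above.

    # Monte Carlo check of the area of the gamma-cut of C_n.
    import numpy as np

    def in_level_set(x, y, gamma, n):
        """Membership test for the gamma-cut of C_n as described above (n > 1)."""
        a = 1.0 - gamma
        p = n * a / (2.0 * n - 1.0)
        first = (x >= 0.0) & (x <= p) & (y >= 0.0) & (y <= a - (n - 1.0) / n * x)
        second = (x > p) & (x <= a) & (y >= 0.0) & (y <= n / (n - 1.0) * (a - x))
        return first | second

    rng = np.random.default_rng(1)
    gamma, n = 0.2, 3.0
    a = 1.0 - gamma
    x = rng.random(500_000) * a
    y = rng.random(500_000) * a
    print(in_level_set(x, y, gamma, n).mean() * a * a)  # Monte Carlo estimate of the area
    print(n * a**2 / (2.0 * n - 1.0))                   # 1/density, as stated in the text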

We can calculate the probabilistic expected values of the random variables X_γ and Y_γ, whose joint distribution is uniform on [C]^γ for all γ ∈ [0, 1], as

    M(X_γ) = (2n-1)/(n(1-γ)²) · ∫_0^{n(1-γ)/(2n-1)} x(1 - γ - ((n-1)/n)·x) dx
           + (2n-1)/((n-1)(1-γ)²) · ∫_{n(1-γ)/(2n-1)}^{1-γ} x(1 - γ - x) dx
           = (1-γ)(4n-1)/(6(2n-1)),

and

    M(Y_γ) = (1-γ)(4n-1)/(6(2n-1)).

(We can easily see that for n = 1 we have M(X_γ) = (1-γ)/2, and for n → ∞ we find M(X_γ) → (1-γ)/3.) We calculate the variances of X_γ and Y_γ as follows. First,

    M(X_γ²) = (2n-1)/(n(1-γ)²) · ∫_0^{n(1-γ)/(2n-1)} x²(1 - γ - ((n-1)/n)·x) dx
            + (2n-1)/((n-1)(1-γ)²) · ∫_{n(1-γ)/(2n-1)}^{1-γ} x²(1 - γ - x) dx
            = (1-γ)²((2n-1)³ + 8n³ - 6n² + n)/(12(2n-1)³).

(We can easily see that for n = 1 we get M(X_γ²) = (1-γ)²/3, and for n → ∞ we find M(X_γ²) → (1-γ)²/6.) Furthermore,

    var(X_γ) = M(X_γ²) - M(X_γ)²
             = (1-γ)²((2n-1)³ + 8n³ - 6n² + n)/(12(2n-1)³) - (1-γ)²(4n-1)²/(36(2n-1)²)
             = (1-γ)²(2(2n-1)² + n)/(36(2n-1)²),

and similarly we obtain

    var(Y_γ) = (1-γ)²(2(2n-1)² + n)/(36(2n-1)²).

(We can easily see that for n = 1 we get var(X_γ) = (1-γ)²/12, and for n → ∞ we find var(X_γ) → (1-γ)²/18.)
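The closed forms above can also be cross-checked symbolically. The following sketch is not part of the paper; it assumes SymPy is installed, the variable names are our own, and it simply integrates the marginal density f_1 given in the previous paragraph.

    # Symbolic cross-check of M(X_gamma), M(X_gamma^2) and var(X_gamma) for C_n.
    import sympy as sp

    n, g, x = sp.symbols('n g x', positive=True)
    a = 1 - g                             # length of the gamma-cut [0, 1 - gamma]
    p = n * a / (2*n - 1)                 # breakpoint of the marginal density f_1
    f1_low = (2*n - 1) * (a - (n - 1)/n * x) / (n * a**2)   # f_1 on [0, p]
    f1_high = (2*n - 1) * (a - x) / ((n - 1) * a**2)        # f_1 on [p, 1 - gamma]

    mean = sp.simplify(sp.integrate(x * f1_low, (x, 0, p)) + sp.integrate(x * f1_high, (x, p, a)))
    second = sp.simplify(sp.integrate(x**2 * f1_low, (x, 0, p)) + sp.integrate(x**2 * f1_high, (x, p, a)))
    variance = sp.factor(sp.simplify(second - mean**2))

    print(mean)      # should be equivalent to (1-g)*(4*n-1)/(6*(2*n-1))
    print(variance)  # should be equivalent to (1-g)**2*(2*(2*n-1)**2 + n)/(36*(2*n-1)**2)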

fid var(x γ ) (1 γ)2 ) Ad, 18 cov(x γ, Y γ ) = M(X γ Y γ ) M(X γ )M(Y γ ) = (1 γ)2 (4 1) 12(2 1) 2 (1 γ)2 (1 )(4 1) 6(2 1) 2 (We ca easily see that for = 1 we have cov(x γ, Y γ ) =, ad for we (1 γ)2 fid cov(x γ, Y γ ) ) We ca calculate the probabilisctic correlatio 6 of the radom variables, ρ(x γ, Y γ ) = cov(x γ, Y γ ) var(xγ ) (1 )(4 1) = var(y γ ) 2(2 1) 2 + (We ca easily see that for = 1 we have ρ(x γ, Y γ ) =, ad for we fid ρ(x γ, Y γ ) 1 ) Ad fially the f-weighted possibilistic correlatio of A ad 2 B is computed as, ρ f (A, B) = ρ(x γ, Y γ )f(γ)dγ = (1 )(4 1) 2(2 1) 2 + We obtai, that ρ f (A, B) = for = 1 ad if the ρ f (A, B) 1 2 A trasitio from -1/2 to -/5 Suppose that the joit possibility distributio of A ad B (where A(x) = B(x) = (1 x) χ [,1] (x), for x R) is defied by C (x, y) = (1 x y) χ T (x, y), where T = } (x, y) R 2 1 x, y, x + y 1, 1 x y (x, y) R 2 x, y, x + y 1, ( 1)x y } I the followig, for simplicity, we well write C istead of C A γ-level set of C is computed by [C] γ = (x, y) R 2 x 1 } (1 γ), ( 1)x y 1 γ x

3 A transition from -1/2 to -3/5

Suppose now that the joint possibility distribution of A and B (where A(x) = B(x) = (1 - x)·χ_[0,1](x) for x ∈ R) is defined, for n ≥ 2, by

    C_n(x, y) = (1 - x - y)·χ_{T_n}(x, y),

where

    T_n = {(x, y) ∈ R² | x ≥ 0, y ≥ 0, x + y ≤ 1, (1/(n-1))·x ≥ y}
        ∪ {(x, y) ∈ R² | x ≥ 0, y ≥ 0, x + y ≤ 1, (n-1)·x ≤ y}.

(For n = 2 we have T_2 = T, that is, C_2 = F.) In the following, for simplicity, we will write C instead of C_n. A γ-level set of C is computed by

    [C]^γ = {(x, y) ∈ R² | 0 ≤ x ≤ (1-γ)/n, (n-1)·x ≤ y ≤ 1 - γ - x}
          ∪ {(x, y) ∈ R² | 0 ≤ y ≤ (1-γ)/n, (n-1)·y ≤ x ≤ 1 - γ - y}.

The density function of a uniform distribution on [C]^γ can be written as

    f(x, y) = 1 / ∫∫_{[C]^γ} dx dy = n/(1-γ)²,   if (x, y) ∈ [C]^γ,
              0,                                  otherwise.

We can calculate the marginal density functions:

    f_1(x) = n(1 - γ - nx + x/(n-1))/(1-γ)²,   if 0 ≤ x ≤ (1-γ)/n,
             nx/((n-1)(1-γ)²),                  if (1-γ)/n ≤ x ≤ (n-1)(1-γ)/n,
             n(1 - γ - x)/(1-γ)²,               if (n-1)(1-γ)/n ≤ x ≤ 1 - γ,

and

    f_2(y) = n(1 - γ - ny + y/(n-1))/(1-γ)²,   if 0 ≤ y ≤ (1-γ)/n,
             ny/((n-1)(1-γ)²),                  if (1-γ)/n ≤ y ≤ (n-1)(1-γ)/n,
             n(1 - γ - y)/(1-γ)²,               if (n-1)(1-γ)/n ≤ y ≤ 1 - γ.

We can calculate the probabilistic expected values of the random variables X_γ and Y_γ, whose joint distribution is uniform on [C]^γ for all γ ∈ [0, 1], as

    M(X_γ) = n/(1-γ)² · ∫_0^{(1-γ)/n} x(1 - γ - nx + x/(n-1)) dx
           + n/((n-1)(1-γ)²) · ∫_{(1-γ)/n}^{(n-1)(1-γ)/n} x² dx
           + n/(1-γ)² · ∫_{(n-1)(1-γ)/n}^{1-γ} x(1 - γ - x) dx
           = (1-γ)/3.

That is, M(Y_γ) = (1-γ)/3 as well. We calculate the variances of X_γ and Y_γ as follows. First,

    M(X_γ²) = n/(1-γ)² · ∫_0^{(1-γ)/n} x²(1 - γ - nx + x/(n-1)) dx
            + n/((n-1)(1-γ)²) · ∫_{(1-γ)/n}^{(n-1)(1-γ)/n} x³ dx
            + n/(1-γ)² · ∫_{(n-1)(1-γ)/n}^{1-γ} x²(1 - γ - x) dx
            = (1-γ)²(3n² - 3n + 2)/(12n²),

and

    var(X_γ) = M(X_γ²) - M(X_γ)² = (1-γ)²(3n² - 3n + 2)/(12n²) - (1-γ)²/9
             = (1-γ)²(5n² - 9n + 6)/(36n²).

And, similarly, we obtain

    var(Y_γ) = (1-γ)²(5n² - 9n + 6)/(36n²).

From

    cov(X_γ, Y_γ) = M(X_γ Y_γ) - M(X_γ)M(Y_γ) = (1-γ)²(3n - 2)/(12n²) - (1-γ)²/9,

we can calculate the probabilistic correlation of the random variables:

    ρ(X_γ, Y_γ) = cov(X_γ, Y_γ)/(√var(X_γ) · √var(Y_γ)) = (-3n² + 7n - 6)/(5n² - 9n + 6).

And finally, the f-weighted possibilistic correlation of A and B:

    ρ_f(A, B) = ∫_0^1 ρ(X_γ, Y_γ) f(γ) dγ = (-3n² + 7n - 6)/(5n² - 9n + 6).

We obtain that for n = 2

    ρ_f(A, B) = -1/2,

and if n → ∞, then

    ρ_f(A, B) → -3/5.

We note that in this extremal case the joint possibility distribution is nothing else but the marginal distributions themselves, that is, C_∞(x, y) = 0 for any interior point (x, y) of the unit square.
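The level-wise moments displayed above can be spot-checked numerically. The following sketch is not part of the paper; it assumes NumPy, uses our own helper name, and samples the two wedge-shaped pieces of [C_n]^γ by rejection, comparing the empirical mean, variance and covariance with the closed forms just derived.

    # Monte Carlo spot-check of M(X_gamma), var(X_gamma) and cov(X_gamma, Y_gamma).
    import numpy as np

    def sample_wedges(gamma, n, size, rng):
        """Uniform rejection sampling on the gamma-cut of C_n from this section (n >= 2)."""
        a = 1.0 - gamma
        pts = np.empty((0, 2))
        while len(pts) < size:
            cand = rng.random((size, 2)) * a
            x, y = cand[:, 0], cand[:, 1]
            upper = (x <= a / n) & (y >= (n - 1.0) * x) & (y <= a - x)
            lower = (y <= a / n) & (x >= (n - 1.0) * y) & (x <= a - y)
            pts = np.vstack([pts, cand[upper | lower]])
        return pts[:size]

    rng = np.random.default_rng(3)
    gamma, n = 0.25, 4.0
    a = 1.0 - gamma
    x, y = sample_wedges(gamma, n, 300_000, rng).T
    print(x.mean(), a / 3.0)                                           # M(X_gamma)
    print(x.var(), a**2 * (5.0*n**2 - 9.0*n + 6.0) / (36.0 * n**2))    # var(X_gamma)
    print(np.mean(x * y) - x.mean() * y.mean(),
          a**2 * (3.0*n - 2.0) / (12.0 * n**2) - a**2 / 9.0)           # cov(X_gamma, Y_gamma)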

4 A trapezoidal case

Consider now the case when

    A(x) = B(x) = x,       if 0 ≤ x ≤ 1,
                  1,       if 1 ≤ x ≤ 2,
                  3 - x,   if 2 ≤ x ≤ 3,
                  0,       otherwise,

for x ∈ R, that is, [A]^γ = [B]^γ = [γ, 3 - γ] for γ ∈ [0, 1]. Suppose that their joint possibility distribution is given by

    C(x, y) = y,   if 0 ≤ y ≤ 1, y ≤ x, x + y ≤ 3,
              1,   if 1 ≤ x ≤ 2, 1 ≤ y ≤ 2, x + y ≤ 3,
              x,   if 0 ≤ x ≤ 1, x ≤ y, x + y ≤ 3,
              0,   otherwise.

Then

    [C]^γ = {(x, y) ∈ R² | γ ≤ x ≤ 3 - γ, γ ≤ y ≤ 3 - x}.

The density function of a uniform distribution on [C]^γ can be written as

    f(x, y) = 1 / ∫∫_{[C]^γ} dx dy = 2/(3 - 2γ)²,   if (x, y) ∈ [C]^γ,
              0,                                     otherwise.

The marginal density functions are obtained as

    f_1(x) = 2(3 - γ - x)/(3 - 2γ)²,   if γ ≤ x ≤ 3 - γ,

and

    f_2(y) = 2(3 - γ - y)/(3 - 2γ)²,   if γ ≤ y ≤ 3 - γ.

We can calculate the probabilistic expected values of the random variables X_γ and Y_γ, whose joint distribution is uniform on [C]^γ for all γ ∈ [0, 1]:

    M(X_γ) = 2/(3 - 2γ)² · ∫_γ^{3-γ} x(3 - γ - x) dx = (γ + 3)/3,

and

    M(Y_γ) = 2/(3 - 2γ)² · ∫_γ^{3-γ} y(3 - γ - y) dy = (γ + 3)/3.

We calculate the variances of X_γ and Y_γ from the formula var(X) = M(X²) - M(X)². First,

    M(X_γ²) = 2/(3 - 2γ)² · ∫_γ^{3-γ} x²(3 - γ - x) dx = (2γ² + 9)/6,

and

    var(X_γ) = M(X_γ²) - M(X_γ)² = (2γ² + 9)/6 - (γ + 3)²/9 = (3 - 2γ)²/18.

And similarly we obtain

    var(Y_γ) = (3 - 2γ)²/18.

Using the relationship

    cov(X_γ, Y_γ) = M(X_γ Y_γ) - M(X_γ)M(Y_γ) = -(3 - 2γ)²/36,

we can calculate the probabilistic correlation of the random variables:

    ρ(X_γ, Y_γ) = cov(X_γ, Y_γ)/(√var(X_γ) · √var(Y_γ)) = -1/2.

And finally, the f-weighted possibilistic correlation of A and B is

    ρ_f(A, B) = ∫_0^1 (-1/2) f(γ) dγ = -1/2.
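Since each γ-cut here is a right triangle with legs of length 3 - 2γ, the value ρ(X_γ, Y_γ) = -1/2 can also be checked directly by sampling. The sketch below is not part of the paper; it assumes NumPy and the helper name is our own.

    # Monte Carlo sanity check for the trapezoidal case.
    import numpy as np

    def sample_trapezoidal_cut(gamma, size, rng):
        """Uniform samples from {gamma <= x <= 3 - gamma, gamma <= y <= 3 - x}."""
        u = rng.random((size, 2))
        u = np.where(u.sum(axis=1, keepdims=True) > 1.0, 1.0 - u, u)  # uniform on the unit triangle
        return gamma + (3.0 - 2.0 * gamma) * u

    rng = np.random.default_rng(4)
    for gamma in (0.0, 0.5, 1.0):
        pts = sample_trapezoidal_cut(gamma, 200_000, rng)
        print(gamma, np.corrcoef(pts[:, 0], pts[:, 1])[0, 1])  # each value is close to -0.5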

5 Time Series With Fuzzy Data

A time series with fuzzy data is referred to as a fuzzy time series (see [3]). Consider a fuzzy time series indexed by t ∈ (0, 1],

    A_t(x) = 1 - x/t,   if 0 ≤ x ≤ t,
             0,         otherwise,

and

    A_0(x) = 1,   if x = 0,
             0,   otherwise.

It is easy to see that in this case

    [A_t]^γ = [0, t(1 - γ)],   γ ∈ [0, 1].

If we have t_1, t_2 ∈ [0, 1], then the joint possibility distribution of the corresponding fuzzy numbers is given by

    C(x, y) = (1 - x/t_1 - y/t_2)·χ_T(x, y),

where

    T = {(x, y) ∈ R² | x ≥ 0, y ≥ 0, x/t_1 + y/t_2 ≤ 1}.

Then

    [C]^γ = {(x, y) ∈ R² | x ≥ 0, y ≥ 0, x/t_1 + y/t_2 ≤ 1 - γ}.

The density function of a uniform distribution on [C]^γ can be written as

    f(x, y) = 1 / ∫∫_{[C]^γ} dx dy = 2/(t_1 t_2 (1 - γ)²),   if (x, y) ∈ [C]^γ,
              0,                                              otherwise.

The marginal density functions are obtained as

    f_1(x) = 2(1 - γ - x/t_1)/(t_1(1 - γ)²),   if 0 ≤ x ≤ t_1(1 - γ),

and

    f_2(y) = 2(1 - γ - y/t_2)/(t_2(1 - γ)²),   if 0 ≤ y ≤ t_2(1 - γ).

We can calculate the probabilistic expected values of the random variables X_γ and Y_γ, whose joint distribution is uniform on [C]^γ for all γ ∈ [0, 1]:

    M(X_γ) = 2/(t_1(1 - γ)²) · ∫_0^{t_1(1-γ)} x(1 - γ - x/t_1) dx = t_1(1 - γ)/3,

and

    M(Y_γ) = 2/(t_2(1 - γ)²) · ∫_0^{t_2(1-γ)} y(1 - γ - y/t_2) dy = t_2(1 - γ)/3.
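Because each γ-cut is a triangle with legs t_1(1 - γ) and t_2(1 - γ), and rescaling the axes leaves a correlation coefficient unchanged, the level-wise correlation can also be examined numerically before completing the calculation. The sketch below is not part of the paper; it assumes NumPy and the helper name is our own.

    # Numerical illustration: the level-wise correlation does not depend on t1, t2 or gamma.
    import numpy as np

    def sample_cut(t1, t2, gamma, size, rng):
        """Uniform samples from {x, y >= 0, x/t1 + y/t2 <= 1 - gamma}."""
        u = rng.random((size, 2))
        u = np.where(u.sum(axis=1, keepdims=True) > 1.0, 1.0 - u, u)
        return u * np.array([t1 * (1.0 - gamma), t2 * (1.0 - gamma)])

    rng = np.random.default_rng(5)
    for t1, t2, gamma in [(0.2, 0.9, 0.1), (0.5, 0.5, 0.6), (1.0, 0.3, 0.3)]:
        pts = sample_cut(t1, t2, gamma, 200_000, rng)
        print(t1, t2, gamma, np.corrcoef(pts[:, 0], pts[:, 1])[0, 1])  # each value is close to -0.5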

We now calculate the variances of X_γ and Y_γ. First,

    M(X_γ²) = 2/(t_1(1 - γ)²) · ∫_0^{t_1(1-γ)} x²(1 - γ - x/t_1) dx = t_1²(1 - γ)²/6,

and

    var(X_γ) = M(X_γ²) - M(X_γ)² = t_1²(1 - γ)²/6 - t_1²(1 - γ)²/9 = t_1²(1 - γ)²/18.

And, in a similar way, we obtain

    var(Y_γ) = t_2²(1 - γ)²/18.

From

    cov(X_γ, Y_γ) = -t_1 t_2 (1 - γ)²/36,

we can calculate the probabilistic correlation of the random variables:

    ρ(X_γ, Y_γ) = cov(X_γ, Y_γ)/(√var(X_γ) · √var(Y_γ)) = -1/2.

The f-weighted possibilistic correlation of A_{t_1} and A_{t_2} is therefore

    ρ_f(A_{t_1}, A_{t_2}) = ∫_0^1 (-1/2) f(γ) dγ = -1/2.

So the autocorrelation function of this fuzzy time series is constant; namely,

    R(t_1, t_2) = -1/2

for all t_1, t_2 ∈ [0, 1].

References

[1] C. Carlsson, R. Fullér and P. Majlender, On possibilistic correlation, Fuzzy Sets and Systems, 155(2005), 425-445.

[2] R. Fullér, J. Mezei and P. Várlaki, A New Look at Possibilistic Correlation, Fuzzy Sets and Systems (submitted).

[3] B. Möller, M. Beer and U. Reuter, Theoretical Basics of Fuzzy Randomness - Application to Time Series with Fuzzy Data, in: Safety and Reliability of Engineering Systems and Structures - Proceedings of the 9th Int. Conference on Structural Safety and Reliability, edited by G. Augusti, G. I. Schueller and M. Ciampoli, Millpress, Rotterdam, pages 171-177.