Random Vectors Part A


Page 1: Outline
- Random Vectors
- Measurement of Dependence: Covariance
- Other Methods of Representing Dependence: Set of Joint Distributions; Copulas
- Common Random Vectors
- Functions of Random Vectors: One-to-one mappings; Many-to-one mappings

Page 2: Random Vectors
A random vector [X_1, X_2, ..., X_N] is made up of (possibly dependent) random variables X_1, X_2, ..., X_N.
Ex: Temperature and wind speed over the next two days in Dallas: [T_0, V_0, T_1, V_1].
- Dependence between T_1 and V_1: temperature and wind of tomorrow.
- Dependence between T_0 and T_1: temperatures of today and tomorrow.
Joint cdf of random variables: F_{1,2}(x_1, x_2) = P(X_1 ≤ x_1, X_2 ≤ x_2). The random vector [X_1, X_2] is defined through its joint cdf F_{1,2}(x_1, x_2).
Joint cdf properties:
- Nondecreasing: F_{1,2}(x_1, x_2) ≤ F_{1,2}(x_1 + u, x_2 + v) for u, v ≥ 0.
- Right-continuous: lim_{u,v → 0+} F_{1,2}(x_1 + u, x_2 + v) = F_{1,2}(x_1, x_2).
- 0-probability: lim_{x_1 → −∞} F_{1,2}(x_1, x_2) = lim_{x_2 → −∞} F_{1,2}(x_1, x_2) = 0.
- 1-probability: lim_{x_1, x_2 → ∞} F_{1,2}(x_1, x_2) = 1.
- Reduction to marginal cdfs: F_1(x_1) = lim_{x_2 → ∞} F_{1,2}(x_1, x_2) and F_2(x_2) = lim_{x_1 → ∞} F_{1,2}(x_1, x_2).
- Rectangle inequality (the cdf is supermodular, i.e., 2-increasing): for u, v ≥ 0,
  F_{1,2}(x_1 + u, x_2 + v) + F_{1,2}(x_1, x_2) − F_{1,2}(x_1 + u, x_2) − F_{1,2}(x_1, x_2 + v) ≥ 0,
  since this combination equals P(x_1 < X_1 ≤ x_1 + u, x_2 < X_2 ≤ x_2 + v) = ∫_{x_1}^{x_1+u} ∫_{x_2}^{x_2+v} f_{1,2}(a, b) db da ≥ 0.
[Figure: the rectangle (x_1, x_1 + u] × (x_2, x_2 + v] in the (X_1, X_2) plane; the surface above it is the density f_{1,2}(x_1, x_2).]
Moment generating function of [X_1, ..., X_N]: E[exp(Σ_{n=1}^{N} t_n X_n)].
The joint cdf is more informative than the marginals:
- the marginals are uniquely determined by the joint;
- the joint is not uniquely determined by the marginals; more on this later.
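The rectangle property is easy to sanity-check numerically once a concrete joint cdf is chosen. A minimal Python sketch, assuming (purely for illustration, not from the slides) two independent Uniform(0,1) coordinates, so that the joint cdf is F(x_1, x_2) = x_1 x_2 on the unit square:

```python
def F(x1, x2):
    # joint cdf of two independent Uniform(0,1) coordinates: F(x1, x2) = x1*x2
    clip = lambda t: max(0.0, min(1.0, t))
    return clip(x1) * clip(x2)

def rectangle_prob(x1, x2, u, v):
    # P(x1 < X1 <= x1+u, x2 < X2 <= x2+v) recovered from the joint cdf
    return F(x1 + u, x2 + v) - F(x1 + u, x2) - F(x1, x2 + v) + F(x1, x2)

p = rectangle_prob(0.2, 0.3, 0.1, 0.4)
print(p)  # equals the rectangle area 0.1 * 0.4 = 0.04 for this cdf
```

For this cdf the rectangle probability is just the area of the rectangle, and it is nonnegative, as the supermodularity of any joint cdf requires.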

Page 3: Discrete Random Vector
Ex: Toss a fair coin 3 times; let X be the total number of Heads and Y the number of Heads on the first toss. Then X ∈ {0,1,2,3}, Y ∈ {0,1} and X ≥ Y; the inequality hints at dependence between X and Y.
Probabilities:
P(X=0, Y=0) = P(TTT) = 1/8, P(X=1, Y=0) = P(THT or TTH) = 2/8, P(X=1, Y=1) = P(HTT) = 1/8,
P(X=2, Y=0) = P(THH) = 1/8, P(X=2, Y=1) = P(HTH or HHT) = 2/8, P(X=3, Y=1) = P(HHH) = 1/8.
Cdf over {0,1,2,3} × {0,1}:
F_{X,Y}(0,0) = P(X=0, Y=0) = 1/8,
F_{X,Y}(1,0) = F_{X,Y}(0,0) + P(X=1, Y=0) = 3/8,
F_{X,Y}(1,1) = F_{X,Y}(1,0) + P(X=1, Y=1) = 4/8,
F_{X,Y}(2,0) = F_{X,Y}(1,0) + P(X=2, Y=0) = 4/8,
F_{X,Y}(3,0) = F_{X,Y}(2,0) = 4/8,
F_{X,Y}(2,1) = F_{X,Y}(2,0) + P(X=1, Y=1) + P(X=2, Y=1) = F_{X,Y}(1,1) + P(X=2, Y=0) + P(X=2, Y=1) = 7/8,
F_{X,Y}(3,1) = F_{X,Y}(2,1) + P(X=3, Y=1) = 8/8.
Marginal pmf of X:
P(X=0) = P(X=0, Y=0) = 1/8, P(X=1) = P(X=1, Y=0) + P(X=1, Y=1) = 3/8,
P(X=2) = P(X=2, Y=0) + P(X=2, Y=1) = 3/8, P(X=3) = P(X=3, Y=1) = 1/8.
Marginal pmf of Y: P(Y=0) = 4/8 and P(Y=1) = 4/8.
[Figure: the cdf F_{X,Y} plotted over 0 ≤ X ≤ 3, Y ∈ {0,1}; at each jump the cdf takes the upper (black-dot) value, not the lower (yellow-dot) value, by right-continuity.]
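The coin-toss example is small enough to verify by brute force. A sketch that enumerates all 8 equally likely outcomes (the enumeration code is mine, not from the slides):

```python
from itertools import product
from collections import Counter

outcomes = list(product("HT", repeat=3))        # 8 equally likely outcomes
pmf = Counter()
for w in outcomes:
    x = w.count("H")                            # X: total number of heads
    y = 1 if w[0] == "H" else 0                 # Y: head on the first toss?
    pmf[(x, y)] += 1 / len(outcomes)

def cdf(a, b):
    # F_{X,Y}(a, b) = P(X <= a, Y <= b)
    return sum(p for (x, y), p in pmf.items() if x <= a and y <= b)

print(pmf[(1, 0)], cdf(2, 1))   # 2/8 = 0.25 and 7/8 = 0.875
```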

Page 4: A Property of the Joint CDF and the Tail
Ex: For the joint tail probability F̄_X(x_1, x_2) = P([X_1 ≥ x_1] ∩ [X_2 ≥ x_2]) of a continuous random vector, we have
F̄_X(x_1, x_2) = F̄_1(x_1) + F̄_2(x_2) + F_X(x_1, x_2) − 1.
Starting with the complement (continuity makes P(X_i = x_i) = 0):
1 − P([X_1 ≥ x_1] ∪ [X_2 ≥ x_2]) = P(X_1 < x_1, X_2 < x_2) = F_X(x_1, x_2),
P([X_1 ≥ x_1] ∪ [X_2 ≥ x_2]) = P(X_1 ≥ x_1) + P(X_2 ≥ x_2) − P([X_1 ≥ x_1] ∩ [X_2 ≥ x_2]).
Reorganizing,
P([X_1 ≥ x_1] ∩ [X_2 ≥ x_2]) = P(X_1 ≥ x_1) + P(X_2 ≥ x_2) − 1 + F_X(x_1, x_2),
which yields the desired equality.
Ex: F_X(x_1, x_2) + F̄_X(x_1, x_2) < 1 when P([X_1 ≤ x_1] ∩ [X_2 ≥ x_2]) > 0 or P([X_1 ≥ x_1] ∩ [X_2 ≤ x_2]) > 0: the four quadrants around (x_1, x_2) carry total probability 1, and F_X + F̄_X only accounts for the lower-left and upper-right quadrants.
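The tail identity can be checked on any continuous joint distribution. A sketch assuming, hypothetically, two independent unit-rate exponentials (any continuous joint would do):

```python
import math

def F_joint(x1, x2):
    # joint cdf of two independent Expo(1) rvs (assumed toy distribution)
    return (1 - math.exp(-x1)) * (1 - math.exp(-x2))

def tail_joint(x1, x2):
    # joint tail P(X1 >= x1, X2 >= x2), again by independence
    return math.exp(-x1) * math.exp(-x2)

def tail_marginal(x):
    # marginal tail P(X_i >= x)
    return math.exp(-x)

x1, x2 = 0.7, 1.3
lhs = tail_joint(x1, x2)
rhs = tail_marginal(x1) + tail_marginal(x2) + F_joint(x1, x2) - 1
print(abs(lhs - rhs))   # ~0: the identity holds
```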

Page 5: Expected Value of Sums of Dependent RVs
Regardless of the dependence between X_1 and X_2, linearity of expectation holds:
E[g_1(X_1) + g_2(X_2)] = E[g_1(X_1)] + E[g_2(X_2)] for functions g_1, g_2.
Ex: An OM course is taken by 5 students. Independently of the classmates, each student pursues one of six degree programs (SCM, Finance, Marketing, Accounting, MBA, PhD), each with equal probability. What is the expected number of degree programs represented in this course?
Let X_SCM = 1 if at least 1 student in the course pursues the MS in SCM, and define X_Fin, X_Mkt, X_Acc, X_MBA, X_PhD similarly. The number of degree programs represented is
X = X_SCM + X_Fin + X_Mkt + X_Acc + X_MBA + X_PhD.
[X_SCM, X_Fin, X_Mkt, X_Acc, X_MBA, X_PhD] are not independent: with 5 students and 6 degree programs, 1 ≤ X ≤ 5.
Dependence arises because the probability of X_SCM = 1 depends on the conditioning event:
- 5 students & 6 programs: P(X_SCM = 1 | X_Fin = X_Mkt = X_Acc = X_MBA = X_PhD = 0) = 1 > 0 = P(X_SCM = 1 | X_Fin = X_Mkt = X_Acc = X_MBA = X_PhD = 1). (If no other program is represented, all 5 students must be in SCM; if all five other programs are represented, all 5 students are used up.)
- 7 students & 6 programs: P(X_SCM = 1 | X_Fin = X_Mkt = X_Acc = X_MBA = X_PhD = 0) = 1 > 1 − (1 − 1/6)^2 = 1 − P(neither remaining student is in SCM) = P(at least 1 of the 2 remaining students is in SCM), the conditional probability when the other five programs are each represented by one student.
Program representation is dependent even though each student chooses a program independently.

Page 6: Expected Number of Programs Represented in the Class
Regardless of dependence, linearity of expectation gives
E(X) = E(X_SCM) + E(X_Fin) + E(X_Mkt) + E(X_Acc) + E(X_MBA) + E(X_PhD).
Since X_SCM is binary, E(X_SCM) = P(one or more SCM students) = 1 − P(no SCM student).
- P(no SCM student) = P(Students 1, 2, ..., 5 all pursue a program other than SCM).
- Each student independently pursues a program, e.g., P(Students 1 and 2 both pursue Finance) = P(Student 1 in Finance) P(Student 2 in Finance), and each program is equally likely, so P(a given student pursues a program other than SCM) = 1 − 1/6. Hence
  P(no SCM student) = (1 − 1/6)^5 = (5/6)^5.
Therefore E(X_SCM) = 1 − (5/6)^5, and by symmetry E(X_SCM) = E(X_Fin) = E(X_Mkt) = E(X_Acc) = E(X_MBA) = E(X_PhD), so
E(X) = 6(1 − (5/6)^5) ≈ 3.59.
Program representation is dependent; the choice by each student is independent.
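The answer 6(1 − (5/6)^5) can be confirmed by enumerating all 6^5 = 7776 equally likely program assignments. A brute-force sketch (the enumeration is mine, not from the slides):

```python
from itertools import product

n_programs, n_students = 6, 5
total = 0
for assignment in product(range(n_programs), repeat=n_students):
    total += len(set(assignment))               # distinct programs represented
expected = total / n_programs**n_students       # average over 6**5 = 7776 cases

formula = n_programs * (1 - (1 - 1 / n_programs)**n_students)
print(expected, formula)   # both about 3.5887
```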

Page 7: Generalizing the Example of Number of Degree Programs
Ex: A course is taken by m students. Independently of the classmates, each student pursues one of n degree programs, each with equal probability. What is the expected number of degree programs represented in this course?
Let X_i = 1 if at least 1 student in the course pursues program i; the number of degree programs represented is X = Σ_{i=1}^{n} X_i.
[X_1, X_2, ..., X_n] are not independent: with m students and n degree programs, 1 ≤ Σ_{i=1}^{n} X_i ≤ min{m, n}.
Despite dependence, E(X) = Σ_{i=1}^{n} E(X_i). Since X_i is binary,
E(X_i) = P(one or more students pursue program i) = 1 − P(no students for program i) = 1 − (1 − 1/n)^m,
because each student independently pursues a program and each program is equally likely. Hence E(X) = n(1 − (1 − 1/n)^m).
Limits:
- A lot of degree programs (n → ∞): every program gets at most one student, and some get exactly one. Using (1 − 1/n)^m = 1 − m/n + O(1/n²),
  lim_{n→∞} E(X) = lim_{n→∞} n(1 − (1 − 1/n)^m) = lim_{n→∞} (m − O(1/n)) = m = number of students.
- A lot of students (m → ∞): every program gets at least one student, since (1 − 1/n)^m → 0, so
  lim_{m→∞} E(X) = lim_{m→∞} n(1 − (1 − 1/n)^m) = n = number of degree programs.
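Both limits can be seen numerically from the closed form n(1 − (1 − 1/n)^m). A quick sketch (the specific values of m and n are arbitrary choices):

```python
def expected_programs(m, n):
    # E(X) = n * (1 - (1 - 1/n)**m): m students, n equally likely programs
    return n * (1 - (1 - 1 / n)**m)

print(expected_programs(5, 10**7))    # many programs: approaches m = 5
print(expected_programs(10**4, 6))    # many students: approaches n = 6
```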

Page 8: Independence of Random Variables
Although expected values are useful, they are not sufficiently detailed to assess probabilities; then we must work with probability functions. Random variables X, Y are independent (X ⊥ Y) when
P(X ≤ a, Y ≤ b) = P(X ≤ a) P(Y ≤ b) for all a, b,
which implies for
- discrete rvs: p_{X,Y}(a, b) = P(X = a, Y = b) = p_X(a) p_Y(b);
- continuous rvs: ∫_{−∞}^{a} ∫_{−∞}^{b} f_{X,Y}(x, y) dy dx = P(X ≤ a, Y ≤ b) = P(X ≤ a) P(Y ≤ b) = ∫_{−∞}^{a} ∫_{−∞}^{b} f_X(x) f_Y(y) dy dx, which in turn implies f_{X,Y}(x, y) = f_X(x) f_Y(y).
For the marginal and joint densities we have
∫_{−∞}^{a} f_X(x) dx = P(X ≤ a) = lim_{b→∞} P(X ≤ a, Y ≤ b) = ∫_{−∞}^{a} ∫_{−∞}^{∞} f_{X,Y}(x, y) dy dx for every a,
so the marginal density is f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy.
Ex: Are X, Y independent if f_{X,Y}(x, y) = I{x, y ≥ 0} 4xy exp(−x² − y²)? Check the marginal pdfs:
f_X(x) = ∫_0^∞ 4xy exp(−x² − y²) dy = 2x exp(−x²) ∫_0^∞ 2y exp(−y²) dy = 2x exp(−x²) for x ≥ 0.
By symmetry, f_Y(y) = I{y ≥ 0} 2y exp(−y²). Hence f_{X,Y}(x, y) = f_X(x) f_Y(y) and the rvs are independent.
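The marginalization step in this example can be mimicked numerically. A sketch that integrates the joint density over y with a midpoint Riemann sum (the grid parameters y_max and n are arbitrary choices of mine):

```python
import math

def f_joint(x, y):
    # joint pdf f_{X,Y}(x, y) = 4xy exp(-x^2 - y^2) on x, y >= 0
    return 4 * x * y * math.exp(-x**2 - y**2) if x >= 0 and y >= 0 else 0.0

def f_X_numeric(x, y_max=10.0, n=20000):
    # marginal f_X(x): integrate the joint over y by a midpoint Riemann sum
    dy = y_max / n
    return sum(f_joint(x, (k + 0.5) * dy) for k in range(n)) * dy

x = 0.8
print(f_X_numeric(x), 2 * x * math.exp(-x**2))   # the two values agree closely
```

The numeric marginal matches the closed form 2x exp(−x²), consistent with the factorization that establishes independence.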

Page 9: Covariance: Measurement of Dependence
Covariance is a measure of dependence:
Cov(X, Y) = E[(X − E(X))(Y − E(Y))] = E(XY) − E(X)E(Y).
Ex: Cov(Σ_{i=1}^{N} a_i X_i, Σ_{j=1}^{M} b_j Y_j) = Σ_{i=1}^{N} Σ_{j=1}^{M} a_i b_j E(X_i Y_j) − Σ_{i=1}^{N} Σ_{j=1}^{M} a_i b_j E(X_i)E(Y_j) = Σ_{i=1}^{N} Σ_{j=1}^{M} a_i b_j Cov(X_i, Y_j).
Independence is more informative than Covariance = 0: Independence ⇒ Covariance = 0, but dependent random variables can also have Covariance = 0; see the next example.
Ex: Let discrete random variables X, Y have the joint pmf in the following table.
p_{X,Y}(x,y)   y=0    y=1    y=2    p_X(x)
x=0            1/4    0      1/4    1/2
x=1            0      1/2    0      1/2
p_Y(y)         1/4    1/2    1/4
The variables are dependent: p_{X,Y}(0,1) = 0 ≠ (1/2)(1/2) = p_X(0) p_Y(1).
Yet Cov(X, Y) = 0: E(XY) = 1·1·(1/2) = 1/2, E(X) = 1·(1/2) = 1/2 and E(Y) = 1·(1/2) + 2·(1/4) = 1, so Cov(X, Y) = 1/2 − (1/2)·1 = 0.
This is unfortunate: we cannot in general deduce independence from covariance.
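The computation from the table can be replayed in a few lines. A sketch (the dictionary encoding of the table is mine):

```python
pmf = {(0, 0): 0.25, (0, 2): 0.25, (1, 1): 0.5}    # joint pmf from the table

E_X  = sum(p * x     for (x, y), p in pmf.items())
E_Y  = sum(p * y     for (x, y), p in pmf.items())
E_XY = sum(p * x * y for (x, y), p in pmf.items())
cov = E_XY - E_X * E_Y
print(cov)   # 0.0: zero covariance despite dependence

# dependence: p_{X,Y}(0,1) = 0 while p_X(0) * p_Y(1) = 0.5 * 0.5 = 0.25
```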

Page 10: Covariance: Alternative Formula
Ex: Covariance is the integral of the difference between the joint tail and the product of the marginal tails:
Cov(X, Y) = ∫∫ [P(X ≥ u, Y ≥ v) − P(X ≥ u) P(Y ≥ v)] du dv.
Let (X_1, Y_1), (X_2, Y_2) be iid copies of (X, Y). Then
Cov(X, Y) = (1/2)[E(X_1 Y_1) − E(X_1)E(Y_1) + E(X_2 Y_2) − E(X_2)E(Y_2)]
= (1/2)[E(X_1 Y_1) − E(X_1)E(Y_2) + E(X_2 Y_2) − E(X_2)E(Y_1)]   (the copies are independent)
= (1/2) E[(X_1 − X_2)(Y_1 − Y_2)].
Key identity: x_1 − x_2 = ∫ (1{u ≤ x_1} − 1{u ≤ x_2}) du:
- for x_2 ≤ x_1 (x_2 left-below), x_1 − x_2 = ∫_{x_2}^{x_1} du = ∫ 1{x_2 < u ≤ x_1} du = ∫ (1{u ≤ x_1} − 1{u ≤ x_2}) du;
- for x_1 ≤ x_2 (x_2 right-above), x_1 − x_2 = −∫_{x_1}^{x_2} du = ∫ (1{u ≤ x_1} − 1{u ≤ x_2}) du.
Therefore
Cov(X, Y) = (1/2) E[(X_1 − X_2)(Y_1 − Y_2)]
= (1/2) E[∫ (1{u ≤ X_1} − 1{u ≤ X_2}) du ∫ (1{v ≤ Y_1} − 1{v ≤ Y_2}) dv]
= (1/2) ∫∫ E[(1{u ≤ X_1} − 1{u ≤ X_2})(1{v ≤ Y_1} − 1{v ≤ Y_2})] du dv
= (1/2) ∫∫ [E(1{u ≤ X_1} 1{v ≤ Y_1}) − E(1{u ≤ X_1})E(1{v ≤ Y_2}) − E(1{u ≤ X_2})E(1{v ≤ Y_1}) + E(1{u ≤ X_2} 1{v ≤ Y_2})] du dv
= ∫∫ [E(1{u ≤ X} 1{v ≤ Y}) − E(1{u ≤ X})E(1{v ≤ Y})] du dv   (by iid, the first and fourth terms match, as do the second and third)
= ∫∫ [P(X ≥ u, Y ≥ v) − P(X ≥ u) P(Y ≥ v)] du dv,
which is the desired equality.
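For the integer-valued coin-toss pair from Page 3 (X = heads in 3 tosses, Y = head on the first toss), the (u, v) integral collapses exactly to a finite sum over unit cells, so the identity can be verified without any numerical integration. A sketch (the cell-sum reduction is my reading of the identity above):

```python
from itertools import product

outcomes = list(product("HT", repeat=3))
pairs = [(w.count("H"), 1 if w[0] == "H" else 0) for w in outcomes]
p = 1 / len(outcomes)

def P(pred):
    # probability of the event described by pred over the 8 outcomes
    return sum(p for xy in pairs if pred(xy))

E_X  = sum(x for x, y in pairs) * p
E_Y  = sum(y for x, y in pairs) * p
E_XY = sum(x * y for x, y in pairs) * p
cov_direct = E_XY - E_X * E_Y      # Cov(X, Y) = E(XY) - E(X)E(Y)

# Integer-valued variables: P(X >= u) is constant on each cell (i-1, i],
# so the double integral reduces to a sum over integer thresholds (i, j).
cov_tail = sum(
    P(lambda t, i=i, j=j: t[0] >= i and t[1] >= j)
    - P(lambda t, i=i: t[0] >= i) * P(lambda t, j=j: t[1] >= j)
    for i in range(1, 4) for j in range(1, 2)
)
print(cov_direct, cov_tail)   # both 0.25
```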

Page 11: Other Methods for Representing Dependence: Set of Joint Distributions
Marginals do not pin down a unique joint cdf.
Ex: For given marginals {F_1, F_2}, consider two alternative cdfs
F_12(x_1, x_2) = min{F_1(x_1), F_2(x_2)} and G_12(x_1, x_2) = F_1(x_1) F_2(x_2).
Both joint cdfs correspond to the given marginals! Both are nondecreasing and right-continuous and satisfy the 0-probability, 1-probability and rectangle (supermodularity) properties. More importantly, F_12 and G_12 match the marginals:
lim_{x_1→∞} F_12(x_1, x_2) = lim_{x_1→∞} min{F_1(x_1), F_2(x_2)} = F_2(x_2), and similarly for F_1(x_1);
lim_{x_1→∞} G_12(x_1, x_2) = lim_{x_1→∞} F_1(x_1) F_2(x_2) = F_2(x_2), and similarly for F_1(x_1).
Given marginals, a joint cdf is not unique; instead there is a set of joint cdfs R{F_1, F_2, ..., F_N} with those marginals.
Ex: Each F ∈ R{F_1, F_2} has a lower bound and an upper bound (the Fréchet-Hoeffding bounds):
max{F_1(x_1) + F_2(x_2) − 1, 0} ≤ F(x_1, x_2) ≤ min{F_1(x_1), F_2(x_2)}.
For the lower bound,
F(x_1, x_2) = P([X_1 ≤ x_1] ∩ [X_2 ≤ x_2]) = 1 − P([X_1 > x_1] ∪ [X_2 > x_2])   (the two events are complements)
≥ 1 − P(X_1 > x_1) − P(X_2 > x_2) = 1 − (1 − F_1(x_1)) − (1 − F_2(x_2)) = F_1(x_1) + F_2(x_2) − 1,
and also F(x_1, x_2) ≥ 0. For the upper bound see the exercises. For more than 2 random variables see the notes.
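The two bounds can be spot-checked on a grid for a concrete member of R{F_1, F_2}. A sketch using the independence cdf u·v with uniform marginals (an arbitrary choice for illustration), for which the bounds read max(u + v − 1, 0) ≤ uv ≤ min(u, v):

```python
ok = True
steps = 101
for i in range(steps):
    for j in range(steps):
        u, v = i / (steps - 1), j / (steps - 1)
        lower, upper = max(u + v - 1, 0.0), min(u, v)
        ok = ok and (lower - 1e-12 <= u * v <= upper + 1e-12)
print(ok)   # True: the independence cdf sits between the two bounds
```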

Page 12: Achieving the Upper Bound: A Random Vector whose CDF is the Upper Bound
The upper bound min{F_1(x_1), F_2(x_2)} is a cdf: it is the joint cdf of the random vector X = [F_1^{-1}(U), F_2^{-1}(U)], where U is a uniform rv over (0,1):
F_X(x_1, x_2) = P(F_1^{-1}(U) ≤ x_1, F_2^{-1}(U) ≤ x_2)
= P(U ≤ F_1(x_1), U ≤ F_2(x_2))
= P(U ≤ min{F_1(x_1), F_2(x_2)})
= min{F_1(x_1), F_2(x_2)}.
The product F_1(x_1) F_2(x_2) is the joint cdf of [F_1^{-1}(U_1), F_2^{-1}(U_2)], where U_1, U_2 are iid uniform rvs over (0,1):
F_X(x_1, x_2) = P(F_1^{-1}(U_1) ≤ x_1, F_2^{-1}(U_2) ≤ x_2)
= P(U_1 ≤ F_1(x_1), U_2 ≤ F_2(x_2))
= P(U_1 ≤ F_1(x_1)) P(U_2 ≤ F_2(x_2))
= F_1(x_1) F_2(x_2) ≤ min{F_1(x_1), F_2(x_2)}.
See the exercises for achieving the lower bound.
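The construction can be tested by simulation: drive both coordinates with the same uniform and compare the empirical joint cdf to min{F_1, F_2}. A sketch assuming exponential marginals with rates 1 and 2 (hypothetical choices) and a fixed seed:

```python
import math, random

rng = random.Random(0)
lam1, lam2 = 1.0, 2.0                       # assumed exponential rates
F1 = lambda x: 1 - math.exp(-lam1 * x)
F2 = lambda x: 1 - math.exp(-lam2 * x)
F1inv = lambda u: -math.log(1 - u) / lam1   # inverse cdfs
F2inv = lambda u: -math.log(1 - u) / lam2

n = 200_000
us = [rng.random() for _ in range(n)]       # ONE uniform drives both coordinates
x1s = [F1inv(u) for u in us]
x2s = [F2inv(u) for u in us]

a, b = 0.5, 0.4
emp = sum(1 for x1, x2 in zip(x1s, x2s) if x1 <= a and x2 <= b) / n
print(emp, min(F1(a), F2(b)))               # empirical joint cdf vs the upper bound
```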

Page 13: Other Methods for Representing Dependence: Achieving Unique Joint Probability with a Copula
Given marginals F_1, F_2, the joint cdf F_X ∈ R{F_1, F_2} of X = [X_1, X_2] is not unique. Use a copula function C: [0,1] × [0,1] → [0,1] to relate the joint cdf to the marginals:
F_X(x_1, x_2) = C(F_1(x_1), F_2(x_2)).
The copula describes the dependence structure of the random variables together, whereas the marginals describe the behavior of each random variable on its own.
Ex: Independence copula: C(u_1, u_2) = u_1 u_2. Comonotonic copula: C(u_1, u_2) = min{u_1, u_2}.
A copula function must satisfy:
- nondecreasing and right-continuous C (from the nondecreasing, right-continuous cdf);
- lim_{u_1→0} C(u_1, u_2) = lim_{u_2→0} C(u_1, u_2) = 0, lim_{u_1→1} C(u_1, u_2) = u_2, lim_{u_2→1} C(u_1, u_2) = u_1 (from the 0- and 1-probability of the cdf);
- supermodularity: C(v_1, v_2) + C(u_1, u_2) ≥ C(u_1, v_2) + C(v_1, u_2) for u_1 ≤ v_1, u_2 ≤ v_2 (from the rectangle inequality for the cdf).
Sklar's theorem: the copula approach does not miss anything. Each F_X ∈ R{F_1, F_2} is representable with a copula (unique when the marginals are continuous). Given the joint cdf F_X of the vector X = [X_1, X_2], there is a copula C satisfying F_X(x_1, x_2) = C(F_1(x_1), F_2(x_2)). This copula is the joint cdf of the random vector X_F = [F_1(X_1), F_2(X_2)]:
- F_1(X_1) is a random variable because it is a function of a random variable. Its cdf is F_{F_1(X_1)}(a) = P(F_1(X_1) ≤ a) = P(X_1 ≤ F_1^{-1}(a)) = F_1(F_1^{-1}(a)) = a, so F_1(X_1) is a uniform random variable: the marginals of X_F are uniform.
- The cdf of X_F is F_{X_F}(a, b) := P(F_1(X_1) ≤ a, F_2(X_2) ≤ b) = P(X_1 ≤ F_1^{-1}(a), X_2 ≤ F_2^{-1}(b)) = F_X(F_1^{-1}(a), F_2^{-1}(b)) = C(F_1(F_1^{-1}(a)), F_2(F_2^{-1}(b))) = C(a, b).
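The building block here, F_1(X_1) being Uniform(0,1) (the probability integral transform), can be checked by simulation. A sketch with an exponential X_1 (a hypothetical marginal) and a fixed seed:

```python
import math, random

rng = random.Random(1)
n = 100_000
xs = [rng.expovariate(1.0) for _ in range(n)]   # X1 ~ Expo(1), assumed marginal
us = sorted(1 - math.exp(-x) for x in xs)       # F1(X1) for each sample

# max deviation between the empirical cdf of F1(X1) and the Uniform(0,1) cdf
dev = max(abs((k + 1) / n - u) for k, u in enumerate(us))
print(dev)   # small deviation: F1(X1) behaves like a Uniform(0,1) sample
```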

Page 14: Comonotonic Copula
For two monotone increasing (or both decreasing) functions g_1, g_2 and a random variable Y: if X_1 = g_1(Y) and X_2 = g_2(Y), then X_1 and X_2 are comonotonic.
Ex: The comonotonic copula C(u_1, u_2) = min{u_1, u_2} yields comonotonic rvs. With the comonotonic copula, X = [X_1, X_2] has the joint cdf
F_X(x_1, x_2) = C(F_{X_1}(x_1), F_{X_2}(x_2)) = min{F_{X_1}(x_1), F_{X_2}(x_2)}.
From the earlier discussion, min{F_{X_1}(x_1), F_{X_2}(x_2)} is the cdf of [F_{X_1}^{-1}(U), F_{X_2}^{-1}(U)] for U uniformly distributed over (0,1). Letting g_1(u) = F_{X_1}^{-1}(u), g_2(u) = F_{X_2}^{-1}(u) and Y = U, we have X_1 = g_1(U) and X_2 = g_2(U), so X_1 and X_2 are comonotonic.

Page 15: Marshall-Olkin Bivariate Exponential Copula
For independent Y_1 ~ Expo(λ_1), Y_2 ~ Expo(λ_2), Z ~ Expo(λ), consider the vector
X = [X_1, X_2] = [min{Y_1, Z}, min{Y_2, Z}]:
Y_1, Y_2 are the lifetimes of two necessary components of a system that receives a fatal shock at time Z.
Tail probability of X_i:
F̄_i(x_i) = P(X_i ≥ x_i) = P(min{Y_i, Z} ≥ x_i) = P(Y_i ≥ x_i) P(Z ≥ x_i) = exp(−(λ_i + λ) x_i),
so that exp(λ x_i) = F̄_i(x_i)^{−λ/(λ_i+λ)}.
Tail probability of X:
F̄_X(x_1, x_2) = P(X_1 ≥ x_1, X_2 ≥ x_2) = P(Y_1 ≥ x_1, Y_2 ≥ x_2, Z ≥ max{x_1, x_2})
= P(Y_1 ≥ x_1) P(Y_2 ≥ x_2) P(Z ≥ max{x_1, x_2})
= exp(−λ_1 x_1) exp(−λ_2 x_2) exp(−λ max{x_1, x_2})
= exp(−(λ_1 + λ) x_1) exp(−(λ_2 + λ) x_2) min{exp(λ x_1), exp(λ x_2)}
= F̄_1(x_1) F̄_2(x_2) min{exp(λ x_1), exp(λ x_2)},
where max{a, b} = a + b − min{a, b} and the monotonicity of exp are used, respectively, in the last two equalities.
The cdf of X:
F_X(x_1, x_2) = F_1(x_1) + F_2(x_2) + F̄_X(x_1, x_2) − 1
= 1 − F̄_1(x_1) − F̄_2(x_2) + F̄_1(x_1) F̄_2(x_2) min{F̄_1(x_1)^{−λ/(λ_1+λ)}, F̄_2(x_2)^{−λ/(λ_2+λ)}}
= 1 − F̄_1(x_1) − F̄_2(x_2) + min{F̄_2(x_2) F̄_1(x_1)^{1−λ/(λ_1+λ)}, F̄_1(x_1) F̄_2(x_2)^{1−λ/(λ_2+λ)}}
= 1 − (1 − F_1) − (1 − F_2) + min{(1 − F_2)(1 − F_1)^{1−α_1}, (1 − F_1)(1 − F_2)^{1−α_2}} =: C(F_1, F_2).
This identifies the Marshall-Olkin copula C: [0,1] × [0,1] → [0,1] with parameters 0 ≤ α_i = λ/(λ_i + λ) ≤ 1:
C(u_1, u_2) = 1 − (1 − u_1) − (1 − u_2) + min{(1 − u_2)(1 − u_1)^{1−α_1}, (1 − u_1)(1 − u_2)^{1−α_2}}.
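The algebra relating the joint tail, the cdf, and the copula can be verified numerically at a point. A sketch with rates λ_1 = 0.5, λ_2 = 1.5, λ = 1 chosen arbitrarily:

```python
import math

l1, l2, l = 0.5, 1.5, 1.0                 # lambda_1, lambda_2, lambda (arbitrary)
a1, a2 = l / (l1 + l), l / (l2 + l)       # alpha_i = lambda / (lambda_i + lambda)

F1 = lambda x: 1 - math.exp(-(l1 + l) * x)   # marginal cdfs of X_i = min{Y_i, Z}
F2 = lambda x: 1 - math.exp(-(l2 + l) * x)

def tail(x1, x2):
    # joint tail P(X1 >= x1, X2 >= x2) from the min construction
    return math.exp(-l1 * x1 - l2 * x2 - l * max(x1, x2))

def C(u1, u2):
    # Marshall-Olkin copula
    return (1 - (1 - u1) - (1 - u2)
            + min((1 - u2) * (1 - u1)**(1 - a1),
                  (1 - u1) * (1 - u2)**(1 - a2)))

x1, x2 = 0.8, 0.3
lhs = F1(x1) + F2(x2) + tail(x1, x2) - 1   # cdf via the tail identity (Page 4)
rhs = C(F1(x1), F2(x2))                    # cdf via the copula
print(abs(lhs - rhs))   # ~0: the two constructions agree
```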

Page 16: Summary
- Random Vectors
- Measurement of Dependence: Covariance
- Other Methods of Representing Dependence: Set of Joint Distributions; Copulas
- Common Random Vectors
- Functions of Random Vectors: One-to-one mappings; Many-to-one mappings