1 Copula Based Independent Component Analysis. Kobi Abayomi. SAMSI, April 2008.
2 Outline: Introduction (Copula Specification; Heuristic Example; Simple Example; The Copula perspective). Mutual Information as a copula-dependent function (MI, KL, Independence). PCA as a special case. Implementation on ESI (Inputs; CICA outputs; Results). Next.
3 Introduction Copula Specification. A copula is a distribution function... Take X_1 ~ F_{X_1} and X_2 ~ F_{X_2}.
4 Introduction Copula Specification ...on the marginal distributions of random variables. Set U = F_{X_1}(X_1) and V = F_{X_2}(X_2). The pair (U, V) are the grades of (X_1, X_2), i.e. the mapping of (X_1, X_2) into (F_{X_1}, F_{X_2}) space.
6 Introduction Copula Specification. A copula is a function that takes the grades as arguments and returns a joint distribution function with marginals F_{X_1}, F_{X_2}: C(U, V) = F_{X_1, X_2}.
7 Introduction Copula Specification. Copula generation: any multivariate distribution function can yield a copula function: F_{X_1, X_2}(F_{X_1}^{-1}(U), F_{X_2}^{-1}(V)) = C(U, V).
8 Introduction Copula Specification. Heuristically speaking: the correspondence which assigns the value of the joint distribution function to each ordered pair of values (F_{X_1}, F_{X_2}), for each X_1, X_2, is a distribution function called a copula.
9 Introduction Heuristic Example GPA - a copula based function
10 Introduction Simple Example. Bivariate distribution: H_θ(x, y) = 1 − e^{−x} − e^{−y} + e^{−(x + y + θxy)} for x, y ∈ R_+; H = 0 otherwise.
11 Introduction Simple Example. Bivariate distribution, inverse margins: H_X^{(−1)}(u) = −ln(1 − u); H_Y^{(−1)}(v) = −ln(1 − v).
12 Introduction Simple Example. Bivariate distribution: C_θ(u, v) = H(−ln(1 − u), −ln(1 − v)) = (u + v − 1) + (1 − u)(1 − v) e^{−θ ln(1 − u) ln(1 − v)}. Notice that if θ = 0, C_θ(u, v) = uv... the independence copula.
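The closed form above is easy to check numerically. A minimal sketch (the function name is mine, not from the slides), assuming grades strictly inside the unit interval:

```python
import math

def gumbel_exp_copula(u, v, theta):
    """C_theta(u, v) = (u + v - 1) + (1 - u)(1 - v) exp(-theta ln(1-u) ln(1-v)),
    the copula extracted from Gumbel's bivariate exponential H_theta.
    Arguments must lie strictly inside the unit interval."""
    return ((u + v - 1.0)
            + (1.0 - u) * (1.0 - v)
            * math.exp(-theta * math.log(1.0 - u) * math.log(1.0 - v)))
```

At θ = 0 the value coincides with uv, the independence copula, as the slide notes.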
13 Introduction Simple Example Bivariate Copula θ = 0, 1, 5
14 Introduction The Copula perspective. Why use a copula? Copulas are useful when modelling: multivariate settings where a different family is needed for each marginal distribution; parametric estimates/versions of measures of association.
16 Introduction The Copula perspective. Copula-dependent measures of association: many measures of association can be expressed as solely copula dependent, e.g. Kendall's tau and Spearman's rho. With U = F_X, V = F_Y: ρ_{(X,Y)} = 12 ∫∫ C(u, v) du dv − 3, i.e. ρ_{(U,V)} = 12 E[C(u, v)] − 3.
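The Spearman's rho formula can be sanity-checked by numerical integration. This sketch (the name and the midpoint-rule quadrature are my choices) evaluates 12 ∫∫ C(u, v) du dv − 3 for any bivariate copula passed in as a function:

```python
def spearman_rho(C, n=400):
    """Approximate rho = 12 * (integral of C(u, v) over the unit square) - 3
    with a midpoint-rule double sum on an n x n grid."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        u = (i + 0.5) * h
        for j in range(n):
            total += C(u, (j + 0.5) * h)
    return 12.0 * total * h * h - 3.0
```

For the independence copula C(u, v) = uv this returns 0; for the comonotone bound min(u, v) it returns approximately 1.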
18 Mutual Information as Copula dependent function. Copula density: with u = (F_{X_1}, ..., F_{X_k}), the copula density dC(u) is dC(u) = dF_X(x) / ∏_i dF_{X_i}(x_i).
19 Mutual Information as Copula dependent function. Mutual Information: it turns out the mutual information MI(x) = ∫_{R^k} dF_X log(dF_X / ∏_i dF_{X_i}) is solely copula dependent...
20 Mutual Information as Copula dependent function. Mutual Information is copula based, since, with u = (F_{X_1}, ..., F_{X_k}): MI(x) = ∫_{I^k} dC(u) log(dC(u)). Thus the MI is a copula-based measure of association.
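The identity can be illustrated by Monte Carlo in the bivariate Gaussian case, where the density ratio dF_X / ∏ dF_{X_i} inside the log is exactly the copula density dC(u) at the grades, and MI has the closed form −(1/2) log(1 − ρ²). A sketch, with all names my own:

```python
import math
import random

def gaussian_copula_mi_mc(rho, n=200_000, seed=0):
    """Monte Carlo estimate of MI for a bivariate Gaussian with correlation rho.

    Each term averages log(f(x, y) / (f(x) f(y))); that density ratio is
    exactly the copula density dC(u) evaluated at the grades, so this is
    the copula form of the mutual information."""
    rng = random.Random(seed)
    s = math.sqrt(1.0 - rho * rho)
    acc = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)
        y = rho * x + s * rng.gauss(0.0, 1.0)
        # log joint density minus log marginal densities (standard normal margins)
        acc += (-0.5 * math.log(1.0 - rho * rho)
                - (x * x - 2.0 * rho * x * y + y * y) / (2.0 * (1.0 - rho * rho))
                + (x * x + y * y) / 2.0)
    return acc / n
```

The analytic value for ρ = 0.8 is −(1/2) log(0.36) ≈ 0.51, and the estimate converges to it; at ρ = 0 the integrand, hence the MI, is identically zero.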
21 Mutual Information as Copula dependent function. MI, KL, Independence. MI is often used as a proxy for independence in general (i.e. non-Gaussian) settings. MI is the Kullback-Leibler divergence ("distance") between dependence and independence. MI = 0 implies independence.
25 PCA as a special case. The PCA model: given multivariate data x ∈ R^k with scatter matrix Σ, the PCA program is: find B such that y = Bx yields uncorrelated y_i and y_j.
27 PCA as a special case. The PCA model, a well-known result: the Singular Value Decomposition (SVD) Σ = E^t Λ E, yielding y_i = e_i^t x with Cov(y_i, y_j) = 0, i ≠ j.
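In two dimensions the spectral step has a closed form. This sketch (names mine) diagonalizes the 2x2 sample scatter matrix with a single rotation angle, so the rotated components are exactly uncorrelated:

```python
import math

def pca_decorrelate_2d(xs, ys):
    """Diagonalize the 2x2 sample scatter matrix Sigma = E^t Lambda E by a
    single rotation; returns the two centered, uncorrelated components."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cxs = [x - mx for x in xs]
    cys = [y - my for y in ys]
    sxx = sum(c * c for c in cxs) / n
    syy = sum(c * c for c in cys) / n
    sxy = sum(a * b for a, b in zip(cxs, cys)) / n
    # rotation angle that zeroes the off-diagonal term of the scatter matrix
    phi = 0.5 * math.atan2(2.0 * sxy, sxx - syy)
    c, s = math.cos(phi), math.sin(phi)
    y1 = [c * a + s * b for a, b in zip(cxs, cys)]
    y2 = [-s * a + c * b for a, b in zip(cxs, cys)]
    return y1, y2
```

The returned components have (empirical) cross-covariance zero up to floating-point error, which is exactly the PCA program above for k = 2.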
28 PCA as a special case The PCA model - mixed data
29 PCA as a special case The PCA model - rotated data
30 Independent Component Analysis (ICA). ICA is the PCA program under the more general assumption of statistical independence. Given x ∈ R^k, the ICA program is: find y = Bx such that y_i = b_i x is independent of y_j = b_j x.
32 Independent Component Analysis (ICA). In ICA, x = As: the observed data x are the mixed outputs of independent sources s. The y are the estimates (ŝ) of these independent components, or signals. B is an estimate of A^{-1}: B = Â^{-1}.
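A toy illustration of x = As and y = Bx. Here A is known and inverted exactly, whereas an ICA procedure must estimate B from the data alone; names are mine:

```python
def unmix_2x2(A, xs):
    """Given mixed observations x = A s with known 2x2 mixing matrix A,
    recover the sources via y = B x with B = A^{-1} (exact inverse here;
    ICA would have to *estimate* B)."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("mixing matrix is singular")
    B = ((d / det, -b / det), (-c / det, a / det))
    return [(B[0][0] * x1 + B[0][1] * x2,
             B[1][0] * x1 + B[1][1] * x2) for x1, x2 in xs]
```

With the true A, y = Bx recovers s exactly; the whole difficulty of ICA is doing this when only x is observed.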
35 The ICA model - illustration
36 The ICA model
37 Comments on ICA. In both PCA and ICA the objective is the recovery of the mixing A of the independent signals s. The difference is the characterization of statistical independence: true statistical independence requires factorization of probability densities. ICA procedures often use higher-order moments, or empirical mutual information, as independence proxies. Most are non-parametric approaches to estimating statistical independence.
41 The ICA model
42 Copula Based ICA (CICA). General approach: replace non-parametric measures of dependence-independence with parametric copula families; appeal to the information-theoretic distance, the K-L divergence; exploit the role of the copula.
45 Bivariate Copula θ = 0, 1, 5
46 K-L divergence. The Kullback-Leibler divergence between two probability density functions f(t) and g(t) is K(f, g) = ∫_t f(t) log(f(t)/g(t)) (1). I can (ab)use the notation K(w, z) for the divergence between the distributions of two random vectors w and z. K ≥ 0, with equality if and only if w and z have the same distribution.
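A discrete version of (1), as a sketch with names of my choosing:

```python
import math

def kl_divergence(f, g):
    """Discrete Kullback-Leibler divergence K(f, g) = sum f(t) log(f(t)/g(t)).

    Nonnegative, and zero iff f and g are the same distribution; terms with
    f(t) = 0 contribute nothing, by the usual 0 log 0 = 0 convention."""
    return sum(p * math.log(p / q) for p, q in zip(f, g) if p > 0.0)
```

Identical distributions give 0, matching the equality condition stated above.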
47 Divergence decomposition. A classic property (Kullback [1968], others) of (1) is K(y, s) = K(y, y*) + K(y*, s) (2), with y* a random vector with independent entries and margins distributed as y; s is an independent vector.
48 Divergence decomposition. The LHS K(y, s) is the divergence of the ICA outputs (estimates) from the hypothesized inputs (sources). K(y, y*) is the independence-dependence of the outputs. K(y*, s) is the mismatch of the margins of the estimates from the margins of the sources.
51 L maximization implies KL minimization. Model x as generated from As = x. The log likelihood for n independent samples of x, under our estimate q̂ of the true source distribution, is L_A = n^{-1} Σ log(q̂(A^{-1} x)) − log|det A|, which converges to ∫ q(·) log(q̂(·)) + const. This can be rewritten: ∫ q(·) log(q(·)) − ∫ q(·) log(q(·)/q̂(·)) = H(y) − K(q(·), q̂(·)).
54 ML maximization implies KL minimization: = H(y) − K(y, s) = H(x) − K(y, s). The RHS first term is the entropy of the inputs; the RHS second term is the distance between the (true) sources and their estimates. ML maximization is equivalent to minimization of the KL distance between the outputs and the sources.
55 Copula Based ICA (CICA). The procedure is to recast (2) via the copula: K(y, y*) = MI(y) = H(dC(u)), if u are the marginal distribution functions u_i = F_i(y_i); and K(y*, s) = Σ_i K(y_i*, s_i), since both y* and s have independent entries.
57 Copula Based ICA (CICA). Under fixed assumptions about the distribution of the sources, I have to minimize two terms: the true objective, the mutual information, expressed via the copula; and the mismatch of the marginal distributions to the assumed distributions.
58 Copula Based Component Analysis - Summary. To cast the K-L partitioning in terms of the copula, I write the independence term as min_B MI(y; B) = min_B E_q(log(dC_Θ(u))), and the marginal fit term as min_Θ [C_Θ(u) − ∏_{i=1}^k (u_i)]. That is, I minimize the mutual information via the copula, via the rotation B = Â^{-1}, after minimizing the distance between the parametric copula and independent marginals.
59 Copula Based Component Analysis - Overview. Set u = G(y*), where y* is still a random, mutually independent vector with margins distributed equivalently with y; thus u* is independent with margins distributed as y. Then K(û, u) = K(û, u*) + K(u*, u), with û the estimate of the true sources output from a copula based procedure and u the true distribution of the sources. The K-L distance between the outputs and the sources is then: (1) the fit of the outputs to independence, K(û, u*); and (2) the fit of the marginals of the outputs to the true source distributions, K(u*, u).
60 Copula Based Component Analysis - Full Model. An approach is to minimize the mutual information of the outputs, with the distributional assumption either fixed or parameterized, exploiting the copula representation: min_B MI(y; B) = min_B E_q̂(log(dC_Θ(u))).
61 Copula Based Component Analysis - Full Model. This is equivalent to maximizing the score ∂L/∂B = −∂K(q(·), q̂(·, B))/∂B via the marginal distributions, ∂L/∂B = −∂K(û, u)/∂B, using the copula model.
62 Copula Based Component Analysis - Full Model. The mixing matrix estimate B can be recovered via gradient ascent/descent, for example. Versions where the joint dependence is captured in a single parameter (multivariate or scalar) may not be applicable for the ICA problem: if Θ_0 is the copula parameter at independence, then lim_{Θ → Θ_0} C_Θ(u) = ∏_{i=1}^k u_i and the mixing matrix at Θ_0 is unidentifiable.
63 Copula Based Component Analysis - Partite Model. Another approach is to set y = RWx, with W a whitening matrix (the product of PCA) and R the ICA rotation. This allows orthogonalization of a mutual information matrix via well-known procedures like the Singular Value Decomposition. Construct the scatter/kernel matrix Γ_{C_Θ} = ((C_{θ_ij}(F̂^n_{w_i^T x}(w_i^T x), F̂^n_{w_j^T x}(w_j^T x))))_{i,j = 1..k}. Find the orthogonalization of Γ_{C_Θ}: λ_1, ..., λ_k. Yield y_k = b_k x_k = r_k w_k x_k, with y_i independent of y_j via C_Θ.
66 Copula Based Component Analysis - Partite Model. Choose exchangeable families at each bivariate pair. Treat the univariate distributions u_i = F_{X_i}(x_i) as observed. The bivariate mutual information(s), E(log(dC(u_i, v_i))), are the elements of the scatter matrix.
69 Copula Based Component Analysis - Partite Model. For well-fit Ĉ_{θ_ij}, MI(Ĉ_{θ_ij}) ≥ 0 for all i, j. Thus R = ((MI(Ĉ_{θ_ij}))) is positive semi-definite, by exchangeability. Singular Value Decomposition of R yields an orthogonal basis (w.r.t. MI) [Tipping and Bishop 1999].
72 Copula Based Component Analysis - Partite Model Some ICA algorithms employ arbitrary non-linear transforms (e.g. logistic [Bell and Sejnowski 1995], log(cosh) [Teschendorf 2004]). Other algorithms use nonparametric estimates of MI [Stogbauer 2004] or cumulant moments [Comon 1994].
74 Copula Based Component Analysis - Comparisons. The copula approach allows flexible choice of the non-linearity u. Parametric estimators of MI (rate O(n^{-1/2})) are superior to the nonparametric version (O(n^{-4/5})). The parametric approach may be more stable on smaller datasets and under moment perturbation [McCullagh 1994, Everson and Roberts 2000], and on extreme value distributions.
77 Example Results 1 - Extreme value distribution.
78 Example Results 1 - Rotated extreme value distribution.
79 Example Results 1; Gumbel-Hougard (GH) type copula - y i = B i x i
80 Example Results 2 - GH type dependency gradient.
81 Example Results 3 - CICA vs. fastica
82 Sample Size Comparison, log(mise) - CICA(GH) vs. fastica
83 Partite Example. Finding closed-form MLE estimators is difficult for many models; there is a balance between tractable MLE estimation and model flexibility.
85 Two Parameter Archimedean Families. A copula family is called Archimedean if the copula can be written C_δ(u, v) = φ_δ(φ_δ^{-1}(u) + φ_δ^{-1}(v)), with φ_δ a generating function parameterized by δ.
86 Two Parameter Archimedean Families. Then C_{θ,δ}(u, v) = η_{θ,δ}(η_{θ,δ}^{-1}(u) + η_{θ,δ}^{-1}(v)), with η_{θ,δ}(s) = ψ_θ(−log φ_δ(s)), is a natural two-parameter extension.
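The one-parameter Archimedean form can be sketched directly from the slides' convention (φ applied to the sum of φ^{-1} values). Here I use the Clayton family as a concrete generator pair; all names are mine:

```python
def archimedean_copula(phi, phi_inv):
    """Build C(u, v) = phi(phi_inv(u) + phi_inv(v)) from a generator pair,
    following the slides' convention for Archimedean copulas."""
    return lambda u, v: phi(phi_inv(u) + phi_inv(v))

# Clayton family as a concrete one-parameter example (delta > 0);
# phi here plays the role of the (Laplace-transform style) generator.
delta = 2.0
clayton = archimedean_copula(
    phi=lambda t: (1.0 + t) ** (-1.0 / delta),
    phi_inv=lambda u: u ** (-delta) - 1.0,
)
```

The construction satisfies the copula boundary condition C(u, 1) = u, and for Clayton it exceeds uv on the interior, i.e. positive dependence.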
87 Application: CICA on ESI. CICA on the 2002 Environmental Sustainability Index (ESI): a scaled linear combination of sixty-eight metrics of environmental concern. Traditional quantities (such as NO_x and SO_2 concentrations) are included with more expansive measures of environmental sustainability (such as civil liberty and corruption). A measure of overall progress towards environmental sustainability, designed to permit systematic and quantitative comparison between nations.
90 ESI Grouping Environmental Systems Natural stocks such as air, soil, and water. Environmental Stresses Stress on ecosystems such as pollution and deforestation. Vulnerability Basic needs such as health, nutrition, and mortality. Capacity Social and economic variables such as corruption and liberty, energy consumption, and schooling rate. Stewardship Global cooperation such as treaty participation and compliance.
95 The complete data ESI
96 Implementation on ESI Partite Approach. The approach: exploit the empirical distribution, setting u_i = F̂_i^n(x_i) and treating the univariate marginals as observed or unparameterized; fit copulas pairwise and minimize MI(u) by diagonalization of a mutual information matrix.
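The first step, u_i = F̂_n(x_i), amounts to rank-based pseudo-observations. A sketch (names mine; the n+1 rescaling is a common convention that keeps the grades off the boundary, and ties are broken arbitrarily):

```python
def empirical_grades(xs):
    """u_i = F_hat_n(x_i): rank-based pseudo-observations, rescaled by
    n + 1 so every grade lies strictly inside the unit interval."""
    n = len(xs)
    order = sorted(range(n), key=lambda i: xs[i])  # indices in ascending order
    u = [0.0] * n
    for rank, i in enumerate(order, start=1):
        u[i] = rank / (n + 1)
    return u
```

These grades are what the pairwise copula fits take as input in the partite approach.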
98 Implementation on ESI Implementation. Fit bivariate copulas; the candidate copulae are two-parameter extensions of Archimedean-type copulae.
100 Implementation on ESI Implementation. Additionally, I fit rotations of the bivariate copulas. The copulas are rotated by setting the arguments equal to the values in the table, with u' = 1 − u, v' = 1 − v: rotation 0: (u, v); 90: (u', v); 180: (u', v'); 270: (u, v').
101 Inputs Bivariate Scatterplot - PCA 1 vs. PCA 2
102 Inputs Bivariate Scatterplot - PCA 2 vs. PCA 3
103 Inputs Bivariate Scatterplot - PCA 1 vs. PCA 3
104 Inputs Trivariate Scatterplot - PCA 1 vs. PCA 2 vs PCA 3
105 CICA outputs Contour/Scatter plot: Copula on CICA 1 vs. CICA 2
106 CICA outputs Contour/Scatter plot: Copula on CICA 2 vs. CICA 3
107 CICA outputs Contour/Scatter plot: Copula on CICA 1 vs. CICA 3
108 Results Image plots: Covariance vs. MI
109 Results Scree plot: PCA and CICA
110 Results CICA loadings. Loadings table for Components 1-3 over the variables SO2, NO2, TSP, ISO14, WATCAP, IUCN, and CO2GDP.
111 Results CICA loadings vs. PCA loadings. CICA: SO2, NO2, TSP, ISO14, WATCAP, IUCN, CO2GDP. PCA: NUKE, BODWAT, TFR, FSHCAT, PESTHA, WATSUP, GRAFT.
112 Results ESI Grouping Environmental Systems Natural stocks such as air, soil, and water. Environmental Stresses Stress on ecosystems such as pollution and deforestation. Vulnerability Basic needs such as health, nutrition, and mortality. Capacity Social and economic variables such as corruption and liberty, energy consumption, and schooling rate. Stewardship Global cooperation such as treaty participation and compliance.
118 Next Comments. The copula approach unifies PCA/ICA as a likelihood-based approach; it is a natural first step toward generalized (linear) copula-based models; it exploits the rich family of lower-dimensional copulae.
121 Next Comments, Next Steps
Gatsby Theoretical Neuroscience Lectures: Non-Gaussian statistics and natural images Parts I-II Gatsby Unit University College London 27 Feb 2017 Outline Part I: Theory of ICA Definition and difference
More informationA copula goodness-of-t approach. conditional probability integral transform. Daniel Berg 1 Henrik Bakken 2
based on the conditional probability integral transform Daniel Berg 1 Henrik Bakken 2 1 Norwegian Computing Center (NR) & University of Oslo (UiO) 2 Norwegian University of Science and Technology (NTNU)
More informationCIFAR Lectures: Non-Gaussian statistics and natural images
CIFAR Lectures: Non-Gaussian statistics and natural images Dept of Computer Science University of Helsinki, Finland Outline Part I: Theory of ICA Definition and difference to PCA Importance of non-gaussianity
More informationLecture 4: Principal Component Analysis and Linear Dimension Reduction
Lecture 4: Principal Component Analysis and Linear Dimension Reduction Advanced Applied Multivariate Analysis STAT 2221, Fall 2013 Sungkyu Jung Department of Statistics University of Pittsburgh E-mail:
More informationA New Generalized Gumbel Copula for Multivariate Distributions
A New Generalized Gumbel Copula for Multivariate Distributions Chandra R. Bhat* The University of Texas at Austin Department of Civil, Architectural & Environmental Engineering University Station, C76,
More informationModelling Dependent Credit Risks
Modelling Dependent Credit Risks Filip Lindskog, RiskLab, ETH Zürich 30 November 2000 Home page:http://www.math.ethz.ch/ lindskog E-mail:lindskog@math.ethz.ch RiskLab:http://www.risklab.ch Modelling Dependent
More informationCalibration Estimation of Semiparametric Copula Models with Data Missing at Random
Calibration Estimation of Semiparametric Copula Models with Data Missing at Random Shigeyuki Hamori 1 Kaiji Motegi 1 Zheng Zhang 2 1 Kobe University 2 Renmin University of China Institute of Statistics
More informationIndependent Component Analysis and Its Applications. By Qing Xue, 10/15/2004
Independent Component Analysis and Its Applications By Qing Xue, 10/15/2004 Outline Motivation of ICA Applications of ICA Principles of ICA estimation Algorithms for ICA Extensions of basic ICA framework
More informationOn the Choice of Parametric Families of Copulas
On the Choice of Parametric Families of Copulas Radu Craiu Department of Statistics University of Toronto Collaborators: Mariana Craiu, University Politehnica, Bucharest Vienna, July 2008 Outline 1 Brief
More informationThe Instability of Correlations: Measurement and the Implications for Market Risk
The Instability of Correlations: Measurement and the Implications for Market Risk Prof. Massimo Guidolin 20254 Advanced Quantitative Methods for Asset Pricing and Structuring Winter/Spring 2018 Threshold
More informationVariational Principal Components
Variational Principal Components Christopher M. Bishop Microsoft Research 7 J. J. Thomson Avenue, Cambridge, CB3 0FB, U.K. cmbishop@microsoft.com http://research.microsoft.com/ cmbishop In Proceedings
More informationDependence, correlation and Gaussianity in independent component analysis
Dependence, correlation and Gaussianity in independent component analysis Jean-François Cardoso ENST/TSI 46, rue Barrault, 75634, Paris, France cardoso@tsi.enst.fr Editor: Te-Won Lee Abstract Independent
More informationFrom independent component analysis to score matching
From independent component analysis to score matching Aapo Hyvärinen Dept of Computer Science & HIIT Dept of Mathematics and Statistics University of Helsinki Finland 1 Abstract First, short introduction
More informationMultivariate Distributions
IEOR E4602: Quantitative Risk Management Spring 2016 c 2016 by Martin Haugh Multivariate Distributions We will study multivariate distributions in these notes, focusing 1 in particular on multivariate
More informationFuncICA for time series pattern discovery
FuncICA for time series pattern discovery Nishant Mehta and Alexander Gray Georgia Institute of Technology The problem Given a set of inherently continuous time series (e.g. EEG) Find a set of patterns
More informationGaussian Process Vine Copulas for Multivariate Dependence
Gaussian Process Vine Copulas for Multivariate Dependence José Miguel Hernández-Lobato 1,2 joint work with David López-Paz 2,3 and Zoubin Ghahramani 1 1 Department of Engineering, Cambridge University,
More informationSTATS 306B: Unsupervised Learning Spring Lecture 12 May 7
STATS 306B: Unsupervised Learning Spring 2014 Lecture 12 May 7 Lecturer: Lester Mackey Scribe: Lan Huong, Snigdha Panigrahi 12.1 Beyond Linear State Space Modeling Last lecture we completed our discussion
More informationMachine learning - HT Maximum Likelihood
Machine learning - HT 2016 3. Maximum Likelihood Varun Kanade University of Oxford January 27, 2016 Outline Probabilistic Framework Formulate linear regression in the language of probability Introduce
More informationChapter 4: Factor Analysis
Chapter 4: Factor Analysis In many studies, we may not be able to measure directly the variables of interest. We can merely collect data on other variables which may be related to the variables of interest.
More informationA Goodness-of-fit Test for Copulas
A Goodness-of-fit Test for Copulas Artem Prokhorov August 2008 Abstract A new goodness-of-fit test for copulas is proposed. It is based on restrictions on certain elements of the information matrix and
More information2234 IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 52, NO. 8, AUGUST Nonparametric Hypothesis Tests for Statistical Dependency
2234 IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 52, NO. 8, AUGUST 2004 Nonparametric Hypothesis Tests for Statistical Dependency Alexander T. Ihler, Student Member, IEEE, John W. Fisher, Member, IEEE,
More informationAn Improved Cumulant Based Method for Independent Component Analysis
An Improved Cumulant Based Method for Independent Component Analysis Tobias Blaschke and Laurenz Wiskott Institute for Theoretical Biology Humboldt University Berlin Invalidenstraße 43 D - 0 5 Berlin Germany
More informationMachine Learning Techniques for Computer Vision
Machine Learning Techniques for Computer Vision Part 2: Unsupervised Learning Microsoft Research Cambridge x 3 1 0.5 0.2 0 0.5 0.3 0 0.5 1 ECCV 2004, Prague x 2 x 1 Overview of Part 2 Mixture models EM
More informationTrivariate copulas for characterisation of droughts
ANZIAM J. 49 (EMAC2007) pp.c306 C323, 2008 C306 Trivariate copulas for characterisation of droughts G. Wong 1 M. F. Lambert 2 A. V. Metcalfe 3 (Received 3 August 2007; revised 4 January 2008) Abstract
More informationTime Varying Hierarchical Archimedean Copulae (HALOC)
Time Varying Hierarchical Archimedean Copulae () Wolfgang Härdle Ostap Okhrin Yarema Okhrin Ladislaus von Bortkiewicz Chair of Statistics C.A.S.E. Center for Applied Statistics and Economics Humboldt-Universität
More informationAdaptive estimation of the copula correlation matrix for semiparametric elliptical copulas
Adaptive estimation of the copula correlation matrix for semiparametric elliptical copulas Department of Mathematics Department of Statistical Science Cornell University London, January 7, 2016 Joint work
More informationFINM 331: MULTIVARIATE DATA ANALYSIS FALL 2017 PROBLEM SET 3
FINM 331: MULTIVARIATE DATA ANALYSIS FALL 2017 PROBLEM SET 3 The required files for all problems can be found in: http://www.stat.uchicago.edu/~lekheng/courses/331/hw3/ The file name indicates which problem
More informationData Analysis and Manifold Learning Lecture 6: Probabilistic PCA and Factor Analysis
Data Analysis and Manifold Learning Lecture 6: Probabilistic PCA and Factor Analysis Radu Horaud INRIA Grenoble Rhone-Alpes, France Radu.Horaud@inrialpes.fr http://perception.inrialpes.fr/ Outline of Lecture
More informationFirst steps of multivariate data analysis
First steps of multivariate data analysis November 28, 2016 Let s Have Some Coffee We reproduce the coffee example from Carmona, page 60 ff. This vignette is the first excursion away from univariate data.
More informationCorrelation & Dependency Structures
Correlation & Dependency Structures GIRO - October 1999 Andrzej Czernuszewicz Dimitris Papachristou Why are we interested in correlation/dependency? Risk management Portfolio management Reinsurance purchase
More informationTail negative dependence and its applications for aggregate loss modeling
Tail negative dependence and its applications for aggregate loss modeling Lei Hua Division of Statistics Oct 20, 2014, ISU L. Hua (NIU) 1/35 1 Motivation 2 Tail order Elliptical copula Extreme value copula
More informationMACHINE LEARNING ADVANCED MACHINE LEARNING
MACHINE LEARNING ADVANCED MACHINE LEARNING Recap of Important Notions on Estimation of Probability Density Functions 22 MACHINE LEARNING Discrete Probabilities Consider two variables and y taking discrete
More informationBayesian Multivariate Extreme Value Thresholding for Environmental Hazards
Bayesian Multivariate Extreme Value Thresholding for Environmental Hazards D. Lupton K. Abayomi M. Lacer School of Industrial and Systems Engineering Georgia Institute of Technology Institute for Operations
More informationNotes on Random Vectors and Multivariate Normal
MATH 590 Spring 06 Notes on Random Vectors and Multivariate Normal Properties of Random Vectors If X,, X n are random variables, then X = X,, X n ) is a random vector, with the cumulative distribution
More informationICA and ISA Using Schweizer-Wolff Measure of Dependence
Keywords: independent component analysis, independent subspace analysis, copula, non-parametric estimation of dependence Abstract We propose a new algorithm for independent component and independent subspace
More informationStatistical Data Mining and Machine Learning Hilary Term 2016
Statistical Data Mining and Machine Learning Hilary Term 2016 Dino Sejdinovic Department of Statistics Oxford Slides and other materials available at: http://www.stats.ox.ac.uk/~sejdinov/sdmml Naïve Bayes
More informationFitting Archimedean copulas to bivariate geodetic data
Fitting Archimedean copulas to bivariate geodetic data Tomáš Bacigál 1 and Magda Komorníková 2 1 Faculty of Civil Engineering, STU Bratislava bacigal@math.sk 2 Faculty of Civil Engineering, STU Bratislava
More informationLatent Variable Models and EM Algorithm
SC4/SM8 Advanced Topics in Statistical Machine Learning Latent Variable Models and EM Algorithm Dino Sejdinovic Department of Statistics Oxford Slides and other materials available at: http://www.stats.ox.ac.uk/~sejdinov/atsml/
More informationSemiparametric Gaussian Copula Models: Progress and Problems
Semiparametric Gaussian Copula Models: Progress and Problems Jon A. Wellner University of Washington, Seattle 2015 IMS China, Kunming July 1-4, 2015 2015 IMS China Meeting, Kunming Based on joint work
More informationIndependent Component Analysis
A Short Introduction to Independent Component Analysis Aapo Hyvärinen Helsinki Institute for Information Technology and Depts of Computer Science and Psychology University of Helsinki Problem of blind
More informationPROPERTIES OF THE EMPIRICAL CHARACTERISTIC FUNCTION AND ITS APPLICATION TO TESTING FOR INDEPENDENCE. Noboru Murata
' / PROPERTIES OF THE EMPIRICAL CHARACTERISTIC FUNCTION AND ITS APPLICATION TO TESTING FOR INDEPENDENCE Noboru Murata Waseda University Department of Electrical Electronics and Computer Engineering 3--
More informationf-divergence Estimation and Two-Sample Homogeneity Test under Semiparametric Density-Ratio Models
IEEE Transactions on Information Theory, vol.58, no.2, pp.708 720, 2012. 1 f-divergence Estimation and Two-Sample Homogeneity Test under Semiparametric Density-Ratio Models Takafumi Kanamori Nagoya University,
More informationEstimation of multivariate critical layers: Applications to rainfall data
Elena Di Bernardino, ICRA 6 / RISK 2015 () Estimation of Multivariate critical layers Barcelona, May 26-29, 2015 Estimation of multivariate critical layers: Applications to rainfall data Elena Di Bernardino,
More information1 Joint and marginal distributions
DECEMBER 7, 204 LECTURE 2 JOINT (BIVARIATE) DISTRIBUTIONS, MARGINAL DISTRIBUTIONS, INDEPENDENCE So far we have considered one random variable at a time. However, in economics we are typically interested
More informationApplication of Copulas as a New Geostatistical Tool
Application of Copulas as a New Geostatistical Tool Presented by Jing Li Supervisors András Bardossy, Sjoerd Van der zee, Insa Neuweiler Universität Stuttgart Institut für Wasserbau Lehrstuhl für Hydrologie
More informationSemiparametric Gaussian Copula Models: Progress and Problems
Semiparametric Gaussian Copula Models: Progress and Problems Jon A. Wellner University of Washington, Seattle European Meeting of Statisticians, Amsterdam July 6-10, 2015 EMS Meeting, Amsterdam Based on
More informationIndependent Component (IC) Models: New Extensions of the Multinormal Model
Independent Component (IC) Models: New Extensions of the Multinormal Model Davy Paindaveine (joint with Klaus Nordhausen, Hannu Oja, and Sara Taskinen) School of Public Health, ULB, April 2008 My research
More informationSeries 7, May 22, 2018 (EM Convergence)
Exercises Introduction to Machine Learning SS 2018 Series 7, May 22, 2018 (EM Convergence) Institute for Machine Learning Dept. of Computer Science, ETH Zürich Prof. Dr. Andreas Krause Web: https://las.inf.ethz.ch/teaching/introml-s18
More informationSongklanakarin Journal of Science and Technology SJST R1 Sukparungsee
Songklanakarin Journal of Science and Technology SJST-0-0.R Sukparungsee Bivariate copulas on the exponentially weighted moving average control chart Journal: Songklanakarin Journal of Science and Technology
More informationEE/Stats 376A: Homework 7 Solutions Due on Friday March 17, 5 pm
EE/Stats 376A: Homework 7 Solutions Due on Friday March 17, 5 pm 1. Feedback does not increase the capacity. Consider a channel with feedback. We assume that all the recieved outputs are sent back immediately
More informationCalibration Estimation for Semiparametric Copula Models under Missing Data
Calibration Estimation for Semiparametric Copula Models under Missing Data Shigeyuki Hamori 1 Kaiji Motegi 1 Zheng Zhang 2 1 Kobe University 2 Renmin University of China Economics and Economic Growth Centre
More informationEstimation of Copula Models with Discrete Margins (via Bayesian Data Augmentation) Michael S. Smith
Estimation of Copula Models with Discrete Margins (via Bayesian Data Augmentation) Michael S. Smith Melbourne Business School, University of Melbourne (Joint with Mohamad Khaled, University of Queensland)
More informationMachine Learning (BSMC-GA 4439) Wenke Liu
Machine Learning (BSMC-GA 4439) Wenke Liu 02-01-2018 Biomedical data are usually high-dimensional Number of samples (n) is relatively small whereas number of features (p) can be large Sometimes p>>n Problems
More informationMultidimensional scaling (MDS)
Multidimensional scaling (MDS) Just like SOM and principal curves or surfaces, MDS aims to map data points in R p to a lower-dimensional coordinate system. However, MSD approaches the problem somewhat
More informationVariational Inference with Copula Augmentation
Variational Inference with Copula Augmentation Dustin Tran 1 David M. Blei 2 Edoardo M. Airoldi 1 1 Department of Statistics, Harvard University 2 Department of Statistics & Computer Science, Columbia
More informationCS281 Section 4: Factor Analysis and PCA
CS81 Section 4: Factor Analysis and PCA Scott Linderman At this point we have seen a variety of machine learning models, with a particular emphasis on models for supervised learning. In particular, we
More informationUnsupervised learning: beyond simple clustering and PCA
Unsupervised learning: beyond simple clustering and PCA Liza Rebrova Self organizing maps (SOM) Goal: approximate data points in R p by a low-dimensional manifold Unlike PCA, the manifold does not have
More informationLECTURE NOTE #10 PROF. ALAN YUILLE
LECTURE NOTE #10 PROF. ALAN YUILLE 1. Principle Component Analysis (PCA) One way to deal with the curse of dimensionality is to project data down onto a space of low dimensions, see figure (1). Figure
More informationIndependent Component Analysis. Contents
Contents Preface xvii 1 Introduction 1 1.1 Linear representation of multivariate data 1 1.1.1 The general statistical setting 1 1.1.2 Dimension reduction methods 2 1.1.3 Independence as a guiding principle
More informationIndependent Component Analysis and Unsupervised Learning
Independent Component Analysis and Unsupervised Learning Jen-Tzung Chien National Cheng Kung University TABLE OF CONTENTS 1. Independent Component Analysis 2. Case Study I: Speech Recognition Independent
More informationNatural Gradient Learning for Over- and Under-Complete Bases in ICA
NOTE Communicated by Jean-François Cardoso Natural Gradient Learning for Over- and Under-Complete Bases in ICA Shun-ichi Amari RIKEN Brain Science Institute, Wako-shi, Hirosawa, Saitama 351-01, Japan Independent
More informationIntroduction to Maximum Likelihood Estimation
Introduction to Maximum Likelihood Estimation Eric Zivot July 26, 2012 The Likelihood Function Let 1 be an iid sample with pdf ( ; ) where is a ( 1) vector of parameters that characterize ( ; ) Example:
More informationEE613 Machine Learning for Engineers. Kernel methods Support Vector Machines. jean-marc odobez 2015
EE613 Machine Learning for Engineers Kernel methods Support Vector Machines jean-marc odobez 2015 overview Kernel methods introductions and main elements defining kernels Kernelization of k-nn, K-Means,
More informationIndependent Component Analysis (ICA) Bhaskar D Rao University of California, San Diego
Independent Component Analysis (ICA) Bhaskar D Rao University of California, San Diego Email: brao@ucsdedu References 1 Hyvarinen, A, Karhunen, J, & Oja, E (2004) Independent component analysis (Vol 46)
More informationGoodness-of-Fit Testing with Empirical Copulas
with Empirical Copulas Sami Umut Can John Einmahl Roger Laeven Department of Econometrics and Operations Research Tilburg University January 19, 2011 with Empirical Copulas Overview of Copulas with Empirical
More informationRandom Matrix Eigenvalue Problems in Probabilistic Structural Mechanics
Random Matrix Eigenvalue Problems in Probabilistic Structural Mechanics S Adhikari Department of Aerospace Engineering, University of Bristol, Bristol, U.K. URL: http://www.aer.bris.ac.uk/contact/academic/adhikari/home.html
More informationPrincipal Component Analysis. Applied Multivariate Statistics Spring 2012
Principal Component Analysis Applied Multivariate Statistics Spring 2012 Overview Intuition Four definitions Practical examples Mathematical example Case study 2 PCA: Goals Goal 1: Dimension reduction
More informationAn Introduction to Spectral Learning
An Introduction to Spectral Learning Hanxiao Liu November 8, 2013 Outline 1 Method of Moments 2 Learning topic models using spectral properties 3 Anchor words Preliminaries X 1,, X n p (x; θ), θ = (θ 1,
More informationPrincipal Component Analysis
Principal Component Analysis Introduction Consider a zero mean random vector R n with autocorrelation matri R = E( T ). R has eigenvectors q(1),,q(n) and associated eigenvalues λ(1) λ(n). Let Q = [ q(1)
More informationMinimax Rate of Convergence for an Estimator of the Functional Component in a Semiparametric Multivariate Partially Linear Model.
Minimax Rate of Convergence for an Estimator of the Functional Component in a Semiparametric Multivariate Partially Linear Model By Michael Levine Purdue University Technical Report #14-03 Department of
More informationExpectation Propagation Algorithm
Expectation Propagation Algorithm 1 Shuang Wang School of Electrical and Computer Engineering University of Oklahoma, Tulsa, OK, 74135 Email: {shuangwang}@ou.edu This note contains three parts. First,
More informationCopula modeling for discrete data
Copula modeling for discrete data Christian Genest & Johanna G. Nešlehová in collaboration with Bruno Rémillard McGill University and HEC Montréal ROBUST, September 11, 2016 Main question Suppose (X 1,
More informationIndependent Component Analysis
Independent Component Analysis Philippe B. Laval KSU Fall 2017 Philippe B. Laval (KSU) ICA Fall 2017 1 / 18 Introduction Independent Component Analysis (ICA) falls under the broader topic of Blind Source
More informationProbability Distribution And Density For Functional Random Variables
Probability Distribution And Density For Functional Random Variables E. Cuvelier 1 M. Noirhomme-Fraiture 1 1 Institut d Informatique Facultés Universitaires Notre-Dame de la paix Namur CIL Research Contact
More informationIndependent Components Analysis
CS229 Lecture notes Andrew Ng Part XII Independent Components Analysis Our next topic is Independent Components Analysis (ICA). Similar to PCA, this will find a new basis in which to represent our data.
More informationJune 21, Peking University. Dual Connections. Zhengchao Wan. Overview. Duality of connections. Divergence: general contrast functions
Dual Peking University June 21, 2016 Divergences: Riemannian connection Let M be a manifold on which there is given a Riemannian metric g =,. A connection satisfying Z X, Y = Z X, Y + X, Z Y (1) for all
More informationConditional Least Squares and Copulae in Claims Reserving for a Single Line of Business
Conditional Least Squares and Copulae in Claims Reserving for a Single Line of Business Michal Pešta Charles University in Prague Faculty of Mathematics and Physics Ostap Okhrin Dresden University of Technology
More informationCopulas and Measures of Dependence
1 Copulas and Measures of Dependence Uttara Naik-Nimbalkar December 28, 2014 Measures for determining the relationship between two variables: the Pearson s correlation coefficient, Kendalls tau and Spearmans
More informationNatural Image Statistics
Natural Image Statistics A probabilistic approach to modelling early visual processing in the cortex Dept of Computer Science Early visual processing LGN V1 retina From the eye to the primary visual cortex
More informationNonparametric Estimation of the Dependence Function for a Multivariate Extreme Value Distribution
Nonparametric Estimation of the Dependence Function for a Multivariate Extreme Value Distribution p. /2 Nonparametric Estimation of the Dependence Function for a Multivariate Extreme Value Distribution
More informationStochastic orders: a brief introduction and Bruno s contributions. Franco Pellerey
Stochastic orders: a brief introduction and Bruno s contributions. Franco Pellerey Stochastic orders (comparisons) Among his main interests in research activity A field where his contributions are still
More information