Global Sensitivity Analysis
1 Global Sensitivity Analysis. Elmar Plischke, Institut für Endlagerforschung, TU Clausthal. Risk Institute Easter School, Liverpool, April 2017. IELF, TU Clausthal GSA 1
2 Contents: Variance-based Sensitivity Analysis; Sobol Method; FAST: Motivation; FAST: Implementation; Variants of FAST; General Framework for Sensitivity Measures; Moment-Independent Importance Measures; Transformation Invariance; Given Data Estimation
3 Global Sensitivity Analysis: Identify sources of variation in the output of y = g(x). Input factors, system model, output statistics. Uncertainty in inputs: random variables X_i, i = 1, ..., k. Deterministic simulator producing scalar output y (as RV: Y). (Q)MC sampling for propagating uncertainties.
4 Functional ANOVA decomposition. Any multivariate integrable mapping g can be decomposed as
g(x) = g_0 + Σ_{i=1}^k g_i(x_i) + Σ_{j>i} g_{i,j}(x_i, x_j) + ... + g_{1,2,...,k}(x_1, x_2, ..., x_k)
if the input probability distribution is independent, F_X(x) = ∏_i F_i(x_i); the terms are
g_0 = ∫ g(x) dF_X(x),
g_i(x_i) = ∫ g(x) ∏_{l≠i} dF_l(x_l) − g_0,
g_{i,j}(x_i, x_j) = ∫ g(x) ∏_{l≠i,j} dF_l(x_l) − g_i(x_i) − g_j(x_j) − g_0.
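These formulas can be checked numerically. The sketch below uses a hypothetical model of my own choosing, g(x_1, x_2) = x_1 + x_2 + x_1 x_2 with independent uniform inputs, for which the first-order component works out analytically to g_1(x_1) = 1.5 x_1 − 0.75 with variance V_1 = 3/16:

```python
import numpy as np

# Illustrative model (not from the slides): g(x1, x2) = x1 + x2 + x1*x2,
# with independent X1, X2 ~ U(0, 1).
rng = np.random.default_rng(1)
n = 200_000
x1, x2 = rng.random(n), rng.random(n)
y = x1 + x2 + x1 * x2

g0 = y.mean()                           # constant term, analytically 5/4

# Estimate g1(x1) = E[y | x1] - g0 by binning x1 (piecewise-constant regression)
bins = 50
idx = np.minimum((x1 * bins).astype(int), bins - 1)
cond_mean = np.bincount(idx, weights=y) / np.bincount(idx)
V1 = np.average((cond_mean - g0) ** 2)  # ≈ Var[g1(X1)] = 3/16
```

The bin means approximate E[y | x_1], and their spread around g_0 approximates the first-order variance contribution V_1.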
7 Functional ANOVA decomposition. A strong annihilating condition holds: if l ∈ {i_1, ..., i_k} then ∫ g_{i_1,...,i_k}(x_{i_1}, ..., x_{i_k}) dF_l(x_l) = 0. This implies orthogonality of the g_α family. The output variance can be decomposed into V[Y] = Σ_α V_α where V_α = ∫ (g_α(x_α))² dF_α(x_α) (with multi-index α). Variance-based sensitivity: the output variance is apportioned to single input parameters or groups of input parameters.
8 Functional ANOVA decomposition. Derived sensitivity indices:
S_i = V_{{i}} / V[Y] (Sobol main/first-order effect, correlation ratio)
S_{Ti} = Σ_{α∋i} V_α / V[Y] (total effect)
S^sub_α = Σ_{β⊆α} V_β / V[Y] (subset importance)
S^sup_α = Σ_{β⊇α} V_β / V[Y] (superset importance)
mean effective dimensionality [Liu and Owen, 2006]: d̄ = Σ_{i=1}^k i Σ_{α: |α|=i} V_α / V[Y] = Σ_{i=1}^k S_{Ti}
9 Variance-Based Sensitivity Analysis. First-order effects: factor prioritization, model structure (sum of all first-order effects). Total effects: factor fixing. The Sobol method and the extended Fourier Amplitude Sensitivity Test (eFAST) need special sampling schemes (e.g. Sobol sequences).
10 Sobol Method. 1. Generate a (quasi-)random sample with N points in 2k dimensions. 2. Split the matrix so that the first k columns are denoted matrix A and the remaining k columns matrix B. 3. For a given input parameter i, construct a matrix A_B^i which consists of all the columns of A except the ith column, which is taken from B. 4. An estimate of V_i, the numerator of S_i, is then V̂_i = (1/N) g(B)ᵀ (g(A_B^i) − g(A)). 5. An estimate of S_{Ti} is given by Jansen's formula, (1/(2N V̂[Y])) (g(A_B^i) − g(A))ᵀ (g(A_B^i) − g(A)), which is shown to be numerically more stable than 1 − (1/(N V̂[Y])) g(A)ᵀ (g(A_B^i) − g(B)).
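The five steps above can be sketched in a few lines of Python/NumPy. This is an illustrative implementation under my own naming (`sobol_indices` is not from the slides), using the pick-and-freeze matrices A, B, A_B^i and the two estimators just quoted:

```python
import numpy as np

def sobol_indices(g, k, N, rng=np.random.default_rng(0)):
    """Estimate first-order (Saltelli) and total (Jansen) Sobol indices.

    g maps an (N, k) array of uniform [0,1) inputs to an output vector.
    """
    A = rng.random((N, k))                 # step 1/2: sample matrices A and B
    B = rng.random((N, k))
    yA, yB = g(A), g(B)
    V = np.var(np.concatenate([yA, yB]), ddof=1)   # total output variance
    S, ST = np.empty(k), np.empty(k)
    for i in range(k):
        AB = A.copy()
        AB[:, i] = B[:, i]                 # step 3: A with column i from B
        yAB = g(AB)
        S[i] = np.mean(yB * (yAB - yA)) / V          # step 4: V_i / V
        ST[i] = np.mean((yAB - yA) ** 2) / (2 * V)   # step 5: Jansen's formula
    return S, ST
```

For an additive test model y = x_1 + 2 x_2 with uniform inputs, the analytic indices are S = S_T = (0.2, 0.8), which the estimates reproduce up to Monte Carlo error.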
11 Sobol Method. Sobol indices are constructed from multiple OAT designs! Total costs: N(1 + k) model runs for Jansen, N(2 + k) for the Sobol/Ishigami/Homma/Saltelli approach. The Sobol method uses correlation coefficients between the output y_A or y_B and y_i. [Figure: Ishigami test function, scatter plots of the output against y_1, ..., y_4 for main and total effects.] For the main effect, large correlation corresponds to large sensitivity; for the total effect, small correlation corresponds to large sensitivity.
13 Random vs. Quasi-Random Sampling. Sampling for the Sobol method can be derived from 1. a random (pseudo-random, if algorithmically generated) source; or 2. a quasi-random source, like Sobol LP_τ or Halton low-discrepancy series. The MC error is of order O(1/√N) while the QMC error is of order O((log N)^k / N).
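To make the MC vs. QMC contrast concrete, here is a small Python sketch with a hand-rolled Halton sequence (the function names are mine, not from the slides). It integrates the illustrative function f(x_1, x_2) = x_1 x_2 over the unit square, whose exact value is 1/4:

```python
import numpy as np

def radical_inverse(i, base):
    """Van der Corput radical inverse: mirror the base-b digits of i at the point."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def halton(n, bases=(2, 3)):
    """First n points of a 2-D Halton low-discrepancy sequence."""
    return np.array([[radical_inverse(i, b) for b in bases]
                     for i in range(1, n + 1)])

n = 1024
qmc = halton(n)
qmc_est = (qmc[:, 0] * qmc[:, 1]).mean()   # QMC estimate of ∫∫ x1*x2 = 1/4
rng = np.random.default_rng(0)
mc = rng.random((n, 2))
mc_est = (mc[:, 0] * mc[:, 1]).mean()      # plain MC estimate for comparison
```

With the same budget of 1024 points, the low-discrepancy estimate is typically much closer to 1/4 than the pseudo-random one, in line with the error orders above.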
16 Fourier Amplitude Sensitivity Test. FAST was introduced in [Cukier et al., 1973, Schaibly and Shuler, 1973, Cukier et al., 1975, Cukier et al., 1978]. Artificial timeframe: each input is assigned its own unique frequency. The output is scanned for resonances. Attributing the resonances to the frequencies gives the contribution to the output variance.
17 Motivation. Ingredients: 1. Ergodicity 2. Search Curves 3. Superposition Principle 4. Parseval's Theorem
19 Ergodic Theorem [Arnold and Avez, 1968]. Let Φ_t be a dynamical system on (T, M, µ). If Φ_t is ergodic then the space average equals the time average: for f ∈ L¹(M, µ) and almost all x_0 ∈ M it holds that ∫_M f(x) dµ(x) = lim_{T→∞} (1/T) ∫_0^T f(Φ_t x_0) dt. Φ_t mixing & measure-preserving: the trajectory visits every point. LHS: via Monte Carlo integral; RHS: sample along a trajectory.
20 Example for Ergodic Systems: Toroidal Shifts. x_1 = t mod 1, x_2 = ωt mod 1 with ω ∉ ℚ: space-filling trajectory along a torus. [Figure: trajectory t ↦ (t/√2, t/√3) mod 1 in the unit square.]
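The equality of space and time averages can be checked numerically for this toroidal shift. The test function below is my own choice (not from the slides); its space average is ∫∫ sin²(2πx_1) cos²(2πx_2) dx_1 dx_2 = (1/2)(1/2) = 1/4:

```python
import numpy as np

def f(x1, x2):
    # smooth periodic test function with space average 1/4
    return np.sin(2 * np.pi * x1) ** 2 * np.cos(2 * np.pi * x2) ** 2

t = 0.1 * np.arange(1, 200_001)          # discretized time axis
x1 = (t / np.sqrt(2)) % 1.0              # toroidal shift trajectory ...
x2 = (t / np.sqrt(3)) % 1.0              # ... with rationally independent speeds
time_average = f(x1, x2).mean()          # should approach the space average 1/4
```

Because the two speeds are rationally independent, the sampled trajectory equidistributes on the torus (Weyl), and the time average converges to the space average.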
22 Unfortunately... Space-filling only for infinite time. Wrap-around introduces discontinuities: periodicity needed, f(x, 0) = f(x, 1), f(0, y) = f(1, y)? The way out: search curves, with reflexion instead of wrap-around, giving closed curves.
23 Search curve. [Figure: trajectory t ↦ (5/11 t, 5/9 t) mod 1, the same curve with reflexion, and the resulting power spectrum.] Different input factors/dimensions can be identified by different frequencies!
24 Superposition Principle. A product of harmonic functions is a sum of harmonic functions: 2 cos α cos β = cos(α + β) + cos(α − β), 2 sin α cos β = sin(α + β) + sin(α − β), 2 sin α sin β = cos(α − β) − cos(α + β). Powers: multiples of the frequencies (higher harmonics). Interactions: resonances in sums and differences of the frequencies. Multiplication becomes addition.
25 Power Spectrum. "The power spectrum gives the portion of a signal's power falling within given frequency bins" (MathWorld). Additive decomposition of the signal's energy. Variance is the signal's energy!
26 Parseval's Theorem. Variance is invariant under orthonormal transformations: V[Y] = V[FY] where F is the Fourier transformation. Identifying the contributions from input parameters and interactions to the output: functional ANOVA decomposition. The power spectrum gives first- and higher-order effects.
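A quick numerical check of this identity with NumPy's FFT conventions (a plain illustration, not part of the slides): summing the power spectrum over all nonzero frequency bins recovers the variance of the signal.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.standard_normal(256)
c = np.fft.fft(y)
power = np.abs(c) ** 2 / len(y)          # spectral energy per frequency bin
# Bin 0 holds n times the squared mean; the remaining bins sum to n * Var[y].
var_from_spectrum = power[1:].sum() / len(y)
```

This is exactly the bookkeeping FAST relies on: dropping the DC bin removes the mean, and what remains is an additive decomposition of the variance over frequencies.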
27 Putting Things Together. Choose the maximal harmonic M as interference factor. Assign frequencies ω_i. Sample size: the Shannon sampling theorem requires n > 2M Σ_{i=1}^k ω_i (Nyquist frequency). Sample (u_{j,i}) from a multi-dimensional search curve. Apply a transformation using inverse cdfs, x_{j,i} = F_i^{−1}(u_{j,i}). Evaluate the model, y_j = f(x_{j,1}, ..., x_{j,k}). Apply a Fast Fourier Transform (FFT) to (y_j), yielding complex Fourier coefficients c_m. Collect the resonances from the power spectrum for first-order effects: S_i = 2 Σ_{m=1}^M |c_{mω_i}|² / Σ_m |c_m|².
28 [Diagram: input factors x_1, x_2, ..., x_k feed the model f; spectral analysis of the output y decomposes the output variance V into resonances v_1, v_2, ..., v_k at the assigned frequencies; Fourier amplitude sensitivity index S_i = v_i / V.]
29 Detail: Maximum Harmonic. Normally the maximum harmonic is M = 4 to 6. If the simulation model is continuous, the Fourier coefficients decay quadratically. More harmonics are needed if the function is discontinuous. [Figure: power spectra of a continuous test function (very flat tail) and a discontinuous one (more noise).]
30 Detail: Frequency Assignment. The choice ω_i = ω^{i−1} with ω = 2M + 1 allows one to identify all effects uniquely up to harmonic M. More elaborate algorithms are available that optimize the use of the frequencies.
31 Detail: Sample Design. Fill the unit hypercube along a search curve: u_{j,i} = (1/π) arccos(cos(2π ω_i (r_i + j/n))), i = 1, ..., k, j = 1, ..., n. Here r_i is an additional random shift.
32 Detail: Fourier Transformation. Written explicitly, c_m = Σ_{j=1}^n y_j ζ_n^{(j−1)m}, m = 0, ..., n−1, with complex unit root ζ_n = e^{−2πi/n}. The powers of the unit root can be cleverly reused, resulting in fast implementations, e.g.
F_3 = [[1, 1, 1], [1, (−1−√3i)/2, (−1+√3i)/2], [1, (−1+√3i)/2, (−1−√3i)/2]],
F_4 = [[1, 1, 1, 1], [1, −i, −1, i], [1, −1, 1, −1], [1, i, −1, −i]].
Not all coefficients are needed; the total variance can be computed the classical way (as a sum of squares).
33 FAST with sample size 8192: Ishigami function. [Figure: FAST main-effect indices for the four inputs, and the power spectrum of the output with the output variance decomposition.]
34 Lower Plot: Explanation of Resonances. Frequencies used: ω_1 = 1, ω_2 = 9, ω_3 = 81 (maximum harmonic M = 4). Blue lines: main effects, the first line being the linear part. Green lines: two-term interaction effects; symmetry! Red lines: three-term interaction effects: nothing visible. Active terms: x_1, x_1³, x_2⁴, x_1 x_3², x_1³ x_3², x_1 x_3⁴, x_1³ x_3⁴.
35 FAST: A minimal MATLAB implementation
% k, model(), trafo() provided
M=4; freq=(2*M+1).^(0:(k-1)); n=2*(2*M+1)^k; % Full frequency scheme
%M=4; freq=[11,21,31]; n=2*M*sum(freq); % Manual frequency scheme
u=acos(cos(2*pi*linspace(1/(2*n),1-1/(2*n),n)'*freq))/pi; % Search curve
x=trafo(u); y=model(x); % Model evaluation
spect=(abs(fft(y))).^2/n; V=sum(spect(2:n)); % Spectrum and total variance
stem(2*spect(2:floor(n/2))/V); % Visualization
Si=2*sum(spect(1+(1:M)'*freq))/V % Main effects
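A Python/NumPy rendering of the same algorithm may be easier to experiment with. This is a sketch under the same assumptions (full frequency scheme; `model` and `trafo` are caller-supplied, acting on an (n, k) array; the function name is mine):

```python
import numpy as np

def fast_indices(model, trafo, k, M=4):
    """FAST main-effect estimates; a Python sketch of the MATLAB snippet above."""
    freq = (2 * M + 1) ** np.arange(k)               # one frequency per input
    n = 2 * (2 * M + 1) ** k                         # sample size above Nyquist rate
    s = np.linspace(1 / (2 * n), 1 - 1 / (2 * n), n)
    u = np.arccos(np.cos(2 * np.pi * np.outer(s, freq))) / np.pi  # search curve
    y = model(trafo(u))
    spect = np.abs(np.fft.fft(y)) ** 2 / n           # power spectrum
    V = spect[1:].sum()                              # total variance (times n)
    harmonics = np.outer(np.arange(1, M + 1), freq)  # bins m * freq_i, m = 1..M
    return 2 * spect[harmonics].sum(axis=0) / V
```

For the additive test model y = x_1 + 2 x_2 with uniform inputs (identity `trafo`), the analytic first-order indices are (0.2, 0.8), which the spectral estimates match closely.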
36 What about totals? With the above-mentioned frequency scheme, every frequency ω̃ ≤ M Σ_{l=0}^{k−1} ω^l = (ω^k − 1)/2 can be uniquely decomposed: ω̃ = Σ_{i=1}^k α_i(ω̃) ω^{i−1}, α_i(ω̃) ∈ {−M, ..., −1, 0, 1, ..., M}. If α_i(ω̃) ≠ 0 then ω̃ contributes to the total effect of input factor i: Ŝ_{Ti} = 2 Σ_{α_i(ω̃)≠0} |c_ω̃|² / Σ_m |c_m|², or equivalently Ŝ_{Ti} = 1 − 2 Σ_{α_i(ω̃)=0} |c_ω̃|² / Σ_m |c_m|². Higher-order effects: combining the zero patterns of the α_i(ω̃).
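The signed base-ω digit decomposition used here can be sketched in a few lines of Python (the function name `alpha` is mine; digits are mapped into the balanced set {−M, ..., M}):

```python
def alpha(freq, M, k):
    """Decompose freq = sum_i alpha_i * (2M+1)**(i-1) with digits in {-M,...,M}."""
    omega = 2 * M + 1
    digits = []
    for _ in range(k):
        r = freq % omega
        if r > M:                  # map remainder into the balanced digit set
            r -= omega
        digits.append(r)
        freq = (freq - r) // omega
    return digits
```

With M = 4, ω = 9 and k = 3, frequency 81 decomposes as the pure third-factor harmonic [0, 0, 1], while 8 = −1 + 1·9 involves factors 1 and 2, so it would count toward both of their total effects.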
37 Extended FAST (eFAST) [Saltelli et al., 1999]: frequency selection scheme for first and total effects. A factor i of interest is assigned a relatively large frequency ω_i, and all others are assigned low frequencies (say, ω_{j≠i} = 1). Total effects: all frequencies below ω_T = ω_i − M Σ_{j≠i} ω_j do not contribute to the variance from factor i up to the Mth order, hence Ŝ_{Ti} = 1 − 2 Σ_{m=1}^{ω_T − 1} |c_m|² / Σ_m |c_m|². k (small) sample blocks are needed.
38 Random Balance Design (RBD) [Tarantola et al., 2006]: for first-order effects. Create a uniform sample u ∈ [0, 1]^n by sampling from u: s ↦ 1 − |2s − 1| = (1/π) arccos(cos(2πs)). Find permutations π_j such that the u^j = π_j(u) are uncorrelated. Transform the marginal distributions, x^j = F_j^{−1}(u^j). Evaluate the model output y = f(x^1, ..., x^k). Apply the inverse permutations to the output, y^j = π_j^{−1}(y). Transform the permuted output y^j via DFT, which yields c_m^j = Σ_{l=1}^n exp(−2πi(l−1)m/n) y_l^j, m = 0, ±1, ..., ±n/2. Estimate the sensitivity Ŝ_j = (2 Σ_{m=1}^M |c_{mω}^j|²) (Σ_m |c_m^j|²)^{−1}.
39 RBD: A minimal MATLAB implementation
% M, n, k, model(), trafo() provided
s=(2*(1:n)'-(n+1))/n; u=acos(-cos(pi*s))/pi; % Triangular sweep through [0,1]
[~,perm]=sort(rand(n,k)); % Random permutations
x=trafo(u(perm)); y=model(x);
ys=zeros(n,k); for i=1:k; ys(perm(:,i),i)=y; end % Undo permutations
spect=(abs(fft(ys))).^2/n; V=sum(spect(2:n,1));
Si=2*sum(spect(1+(1:M),:))/V
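The same procedure in Python/NumPy, as a sketch under the same assumptions (function name and defaults are my own choices):

```python
import numpy as np

def rbd_indices(model, trafo, n, k, M=6, seed=0):
    """Random Balance Design first-order estimates; Python sketch of the MATLAB above."""
    rng = np.random.default_rng(seed)
    s = (2 * np.arange(1, n + 1) - (n + 1)) / n      # s in (-1, 1)
    u = np.arccos(-np.cos(np.pi * s)) / np.pi        # triangular sweep of [0, 1]
    perm = np.argsort(rng.random((n, k)), axis=0)    # one random permutation per input
    x = trafo(u[perm])                               # (n, k) design matrix
    y = model(x)
    ys = np.empty((n, k))
    for i in range(k):
        ys[perm[:, i], i] = y                        # undo the permutation of input i
    spect = np.abs(np.fft.fft(ys, axis=0)) ** 2 / n
    V = spect[1:, 0].sum()                           # total variance of the output
    return 2 * spect[1:M + 1, :].sum(axis=0) / V
```

Undoing the permutation per input makes the conditional mean E[y | x_i] reappear as a low-frequency signal along the triangular sweep, so the first M harmonics estimate the first-order effect.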
40 Effective Algorithm for Sensitivity Indices (EASI) [Plischke, 2010]: sort and shuffle the positions in the sample: first-order effects. Sort-of inverse RBD: construct the permutations from the observations. [Figure: triangular shape of the input obtained via sorting and shuffling, x_i → x_(i) → x_[i].]
41 EASI: A minimal MATLAB implementation
% x, y, M provided
[n,k]=size(x); [~,index]=sort(x);
odd=mod(n,2); shuffle=[1:2:(n-1+odd), (n-odd):-2:2]; % Triangular reordering
ys=y(index(shuffle,:)); % Rearrange output
spect=(abs(fft(ys))).^2/n; V=sum(spect(2:n,1));
Si=2*sum(spect(1+(1:M),:))/V % Collect resonances
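The given-data estimator translates directly to Python/NumPy; again a sketch with my own function name, mirroring the MATLAB line by line:

```python
import numpy as np

def easi_indices(x, y, M=6):
    """EASI given-data first-order estimates; Python sketch of the MATLAB above."""
    n, k = x.shape
    index = np.argsort(x, axis=0)                    # sort positions per input
    odd = n % 2
    # triangular reordering: ascending over odd ranks, descending over even ranks
    shuffle = np.concatenate([np.arange(0, n, 2),
                              np.arange(n - 1 - odd, 0, -2)])
    ys = y[index[shuffle, :]]                        # reordered outputs, (n, k)
    spect = np.abs(np.fft.fft(ys, axis=0)) ** 2 / n
    V = spect[1:, 0].sum()
    return 2 * spect[1:M + 1, :].sum(axis=0) / V
```

Because the reordering only needs the observed ranks of each input, this works on any plain (x, y) sample, with no special design required.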
42 Example EASI. [Figure: Ishigami output after the triangular reordering, plotted against each of the four inputs x_1, ..., x_4, together with the Fourier reconstruction.] Two regression curves appear, for even and odd indices.
43 Second-order effects (and higher if you dare). Given data triplets {(x_i^1, x_i^2, y_i)} (as realizations of some RVs), compute the joint influence of X^1 and X^2 on Y. Method in a nutshell: sort the (x^1, x^2) data along a search curve (nearest neighbor) with a distinct frequency behaviour, reorder the output accordingly, and look out for resonances.
44 Search Curve: Plow Track. Code the (x^1, x^2) position by the length of the search curve. The curve has a detectable frequency behaviour per dimension.
45 Search Curve: Ping-Pong. An alternative curve with large freedom in choosing the frequencies.
46 [Figure: Ishigami function along the plow track: indexed inputs x_2, x_1 against the hyperindex, the reordered output, and the power spectrum with the fraction of variance attributed to parameters 1 and 3, the interaction (1,3), and noise η.]
47 [Figure: Ishigami function along the ping-pong curve: sorted inputs x_1, x_3, the reordered output, and the power spectrum with variance contributions of parameters 1 and 3, the interaction (1,3), and noise η.]
49 What if Variance is not a suitable Measure of Uncertainty? Use suitable output transformations. But: the interpretation of the results on the original scale is difficult. Instead of variance-based measures, use moment-independent indicators which consider the whole distribution instead of single moments.
51 General Frameworks for Sensitivity and Importance Measures. Comparing the joint distribution with the product of the marginals: if d(·, ·) is a 2-dimensional functional distance/divergence, use d((X_i, Y), X_i ⊗ Y). Copula-based approaches, discrepancy, tests of statistical independence. Discrepancy (2D analogon of Kolmogorov-Smirnov): sometimes counter-intuitive.
53 General Frameworks for Sensitivity and Importance Measures. Average of comparing the input distribution with the conditional input distributions: if d(·, ·) is a 1-dimensional functional distance/divergence, use E[d(X, X|Y)]. Reliability / regionalized sensitivity. Mostly of interest for extreme-valued output.
55 General Frameworks for Sensitivity and Importance Measures. Average of comparing the output distribution with the conditional output distributions: if d(·, ·) is a 1-dimensional functional distance/divergence, use E[d(Y, Y|X_i)]. These are importance measures. The rest of the talk focusses on these moment-independent importance measures.
56 Visual impressions: Non-functional dependence. [Figures: a point cloud shaped like the letter P, analysed in turn by linear regression (R²), nonlinear regression (η²), the product of marginals (discrepancy D*), conditioning on y (δ(y,x)), and conditioning on x (δ(x,y)).]
64 Moment-Independent Importance Measures. ζ_i = E[d(Y, Y|X_i)], where d(·, ·) is a shift or separation function (functional metric). Bayesian interpretation: degree of belief before and after getting to know that X_i = x_i, averaged over all possible X_i.
65 Examples for Shift/Separation Measures
ζ_EI(µ_Y, µ_{Y|X=x}) = max{µ_{Y|X=x}, 0} − max{µ_Y, 0} (EVPI, null alternative)
ζ_SI(µ_Y, µ_{Y|X=x}) = σ_Y^{−2} (µ_Y − µ_{Y|X=x})² (main effect)
ζ_KS(F_Y, F_{Y|X=x}) = sup |F_Y − F_{Y|X=x}| (Kolmogorov-Smirnov)
ζ_Ku(F_Y, F_{Y|X=x}) = sup(F_Y − F_{Y|X=x}) − inf(F_Y − F_{Y|X=x}) (Kuiper)
ζ_CvM(F_Y, F_{Y|X=x}) = ∫ (F_{Y|X=x}(y) − F_Y(y))² dy (Cramér, L²(cdf))
ζ_Bo(f_Y, f_{Y|X=x}) = (1/2) ∫ |f_{Y|X=x}(y) − f_Y(y)| dy (Borgonovo, L¹(pdf))
ζ_KL(f_Y, f_{Y|X=x}) = ∫ f_{Y|X=x}(y) log(f_{Y|X=x}(y) / f_Y(y)) dy (Kullback-Leibler)
ζ_He(f_Y, f_{Y|X=x}) = 1 − ∫ √(f_Y(y) f_{Y|X=x}(y)) dy (Hellinger)
66 Which separation to use? We look for sensitivity/importance measures which are simple to interpret; easy to estimate; invariant under monotonic transformations of inputs and outputs; able to detect strong functional links: Y = g(X) ⇒ E[ζ(Y, Y|X)] = 1; and offer a test for independence: E[ζ(Y, Y|X)] = 0 ⇔ Y and X are independent. There is no one-size-fits-all sensitivity method.
67 Moment-Independent Importance Measures II. For moment-independent importance, separation measures are taken between 1. cumulative distribution functions, 2. probability density functions, or 3. characteristic functions.
68 CDF-based Measures. Kolmogorov-Smirnov and Kuiper separation. [Figure: two cumulative distributions.] KS: largest distance. Kuiper: maximal positive distance minus minimal negative distance.
69 PDF-based Measures. Borgonovo separation: (signed) area under the curves. Kullback-Leibler: entropy. [Figure: two densities.]
71 CF-based Measure. CF: φ_X(s) = E[e^{isX}] = ∫ e^{isx} f_X(x) dx. Inverse Fourier transform of the pdf: complex-valued, no finite support. Distance covariance [Székely and Rizzo, 2013]: dcov²(X, Y) = C ∫∫_{ℝ²} |φ_{X,Y}(s, t) − φ_X(s) φ_Y(t)|² / (s² t²) ds dt. Parseval's theorem: sampling-based estimators are available. Many open topics here!
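Despite the double integral over characteristic functions, the sample version has a simple closed form via double-centered distance matrices [Székely and Rizzo, 2013]. A Python sketch of the (biased, V-statistic) estimator:

```python
import numpy as np

def dcov(x, y):
    """Sample distance covariance via double-centered distance matrices.

    V-statistic form; a plain O(n^2) sketch, not an optimized implementation.
    """
    a = np.abs(x[:, None] - x[None, :])              # pairwise distances in x
    b = np.abs(y[:, None] - y[None, :])              # pairwise distances in y
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    v = np.mean(A * B)                               # dcov^2, nonnegative here
    return float(np.sqrt(max(v, 0.0)))
```

dcov vanishes exactly when X and Y are independent, so the sample statistic is clearly positive for a functional link and near zero for independent inputs.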
73 Properties of MIM. For variance-based sensitivity measures, a log transformation of the output switches from the additive (ANOVA) decomposition to multiplicative decompositions; other transformations are also available (Box-Cox, probit, logit). Wanted: a sensitivity measure that is invariant with respect to transformations (sensitivity then becomes a topological property). [Borgonovo et al., 2014]: The sensitivity measure ξ is transformation invariant if the separation is given by ζ(P, Q) = sup_{A∈𝒜} h(|P(A) − Q(A)|) (generalized Birnbaum-Orlicz) or by ζ(f_Y, f_Z) = ∫ H(f_Z(y)/f_Y(y)) f_Y(y) dy (Csiszár divergence).
77 Given Data Methodology. X: k-dimensional random vector; Y: random variable (a quantity of interest for time series). Sources: physical observations, or uncertainty propagation through a model Y = g(X) via simple random sampling of X, Latin hypercube sampling of X, or quasi Monte Carlo sampling (Sobol LP_τ, ...) of X; but not fast multidimensional/sparse-grid quadrature designs. The sample must represent the underlying probabilistic framework: observations are independent realizations of (X, Y).
78 Examples for 2D Uniform [0, 1] Input Samples. [Figure: simple random sample, Latin hypercube sample, and quasi Monte Carlo sample (suitable); uniform design, full factorial design, and sparse grid design (red: bad setup for given-data estimation, but fine for a meta-modeling layer).] Worse space-filling properties in higher dimensions.
79 Going beyond Linear Regression? [Pearson, 1912]: "Nothing can be learnt of association by assuming linearity in a case with a regression line (plane, etc.) like A, much in a case like B." A sensitivity measure always has to be interpreted with respect to the method used. Report the goodness-of-fit for the method: R² for linear regression; R² for rank linear regression; Σ_i S_i for variance-based first-order effects; the sum of variance-based first-order and higher-order effects. Successively use more advanced techniques.
80 Back to the Roots. Correlation ratios [Pearson, 1905]: a piecewise constant regression model for the local means E[Y | X_i = x_i]. Histogram binning: estimate local cdfs/pdfs for use with separation measures. [Figure: scatter plot of Y against X_1 with binned local means.]
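Pearson's correlation ratio η² falls out directly from this binning idea: the variance of the local means over the bins, relative to the total variance. A Python sketch (function name and bin count are my own choices):

```python
import numpy as np

def correlation_ratio(x, y, bins=32):
    """Pearson's correlation ratio eta^2 via piecewise-constant regression."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))   # equal-count bins
    idx = np.clip(np.searchsorted(edges, x, side='right') - 1, 0, bins - 1)
    counts = np.bincount(idx, minlength=bins)
    sums = np.bincount(idx, weights=y, minlength=bins)
    means = sums / np.maximum(counts, 1)                  # local means E[Y | bin]
    V_between = np.sum(counts * (means - y.mean()) ** 2) / len(y)
    return V_between / y.var()                            # between-bin variance share
```

For a deterministic monotone link such as y = x² the estimate approaches 1, while for independent data it stays near zero (up to a small upward bias of order bins/n).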
81 Thank You! Questions, comments? Preprints, scripts, stuff.
82 References I. Arnold, V. I. and Avez, A. (1968). Ergodic Problems of Classical Mechanics. Benjamin, New York. Borgonovo, E., Tarantola, S., Plischke, E., and Morris, M. D. (2014). Transformations and invariance in the sensitivity analysis of computer experiments. Journal of the Royal Statistical Society, Series B, 76. Cukier, R. I., Fortuin, C. M., Shuler, K. E., Petschek, A. G., and Schaibly, J. H. (1973). Study of the sensitivity of coupled reaction systems to uncertainties in rate coefficients. I. Theory. J. Chem. Phys., 59. Cukier, R. I., Levine, H. B., and Shuler, K. E. (1978). Nonlinear sensitivity analysis of multiparameter model systems. J. Comput. Phys., 26(1):1-42. Cukier, R. I., Schaibly, J. H., and Shuler, K. E. (1975). Study of the sensitivity of coupled reaction systems to uncertainties in rate coefficients. III. Analysis of the approximations. J. Chem. Phys., 63. Liu, R. and Owen, A. B. (2006). Estimating mean dimensionality of analysis of variance decompositions. Journal of the American Statistical Association, 101(474).
83 References II. Pearson, K. (1905). On the General Theory of Skew Correlation and Non-linear Regression, volume XIV of Mathematical Contributions to the Theory of Evolution, Drapers' Company Research Memoirs. Dulau & Co., London. Pearson, K. (1912). On the general theory of the influence of selection on correlation and variation. Biometrika, 8(3-4). Plischke, E. (2010). An effective algorithm for computing global sensitivity indices (EASI). Reliability Engineering & System Safety, 95(4). Saltelli, A., Tarantola, S., and Chan, K. (1999). A quantitative, model independent method for global sensitivity analysis of model output. Technometrics, 41. Schaibly, J. H. and Shuler, K. E. (1973). Study of the sensitivity of coupled reaction systems to uncertainties in rate coefficients. II. Applications. J. Chem. Phys., 59.
84 References III. Székely, G. J. and Rizzo, M. L. (2013). Energy statistics: A class of statistics based on distances. Journal of Statistical Planning and Inference, 143. Tarantola, S., Gatelli, D., and Mara, T. A. (2006). Random balance designs for the estimation of first order global sensitivity indices. Reliability Engineering & System Safety, 91.
Some methods for sensitivity analysis of systems / networks
Article Some methods for sensitivity analysis of systems / networks WenJun Zhang School of Life Sciences, Sun Yat-sen University, Guangzhou 510275, China; International Academy of Ecology and Environmental
More informationLectures. Variance-based sensitivity analysis in the presence of correlated input variables. Thomas Most. Source:
Lectures Variance-based sensitivity analysis in the presence of correlated input variables Thomas Most Source: www.dynardo.de/en/library Variance-based sensitivity analysis in the presence of correlated
More informationPolynomial chaos expansions for sensitivity analysis
c DEPARTMENT OF CIVIL, ENVIRONMENTAL AND GEOMATIC ENGINEERING CHAIR OF RISK, SAFETY & UNCERTAINTY QUANTIFICATION Polynomial chaos expansions for sensitivity analysis B. Sudret Chair of Risk, Safety & Uncertainty
More informationSobol-Hoeffding Decomposition with Application to Global Sensitivity Analysis
Sobol-Hoeffding decomposition Application to Global SA Computation of the SI Sobol-Hoeffding Decomposition with Application to Global Sensitivity Analysis Olivier Le Maître with Colleague & Friend Omar
More informationEvaluating prediction uncertainty in simulation models
Submitted to Computer Physics Communications LA-UR-98 1362 Evaluating prediction uncertainty in simulation models Michael D. McKay, 1 John D. Morrison, Stephen C. Upton Los Alamos National Laboratory Los
More informationIntroduction to Statistical Methods for High Energy Physics
Introduction to Statistical Methods for High Energy Physics 2011 CERN Summer Student Lectures Glen Cowan Physics Department Royal Holloway, University of London g.cowan@rhul.ac.uk www.pp.rhul.ac.uk/~cowan
More informationToday: Fundamentals of Monte Carlo
Today: Fundamentals of Monte Carlo What is Monte Carlo? Named at Los Alamos in 940 s after the casino. Any method which uses (pseudo)random numbers as an essential part of the algorithm. Stochastic - not
More informationBivariate Paired Numerical Data
Bivariate Paired Numerical Data Pearson s correlation, Spearman s ρ and Kendall s τ, tests of independence University of California, San Diego Instructor: Ery Arias-Castro http://math.ucsd.edu/~eariasca/teaching.html
More information3 Operations on One Random Variable - Expectation
3 Operations on One Random Variable - Expectation 3.0 INTRODUCTION operations on a random variable Most of these operations are based on a single concept expectation. Even a probability of an event can
More informationTotal interaction index: A variance-based sensitivity index for second-order interaction screening
Total interaction index: A variance-based sensitivity index for second-order interaction screening J. Fruth a, O. Roustant b, S. Kuhnt a,c a Faculty of Statistics, TU Dortmund University, Vogelpothsweg
More informationGlobal Sensitivity Analysis in Structural Optimization
Global Sensitivity Analysis in Structural Optimization Uwe Reuter Department of Civil Engineering, TU Dresden, Germany Martin Liebscher and Heiner Müllerschön DYNAmore GmbH, Stuttgart, Germany Summary:
More informationToday: Fundamentals of Monte Carlo
Today: Fundamentals of Monte Carlo What is Monte Carlo? Named at Los Alamos in 1940 s after the casino. Any method which uses (pseudo)random numbers as an essential part of the algorithm. Stochastic -
More informationwhere r n = dn+1 x(t)