Estimation of probability density functions by the Maximum Entropy Method
Estimation of probability density functions by the Maximum Entropy Method
Claudio Bierig, Alexey Chernov
Institute for Mathematics, Carl von Ossietzky University Oldenburg
3rd Workshop of the GAMM AG UQ, Dortmund, March 2018
Forward uncertainty propagation in complex systems: X = S(ξ)
ξ = input (random variable, known pdf)
X = output (random variable, unknown pdf ρ)
S = system's action (deterministic, known)
Aim: determine the pdf ρ of X (nonintrusively!)
Note: only a finite amount of information can be pushed through S.
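The nonintrusive setup above can be sketched in a few lines; the system action S and the input law below are illustrative stand-ins, not the model in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def S(xi):
    """Deterministic, known system action (hypothetical stand-in)."""
    return np.log1p(xi**2)

# Input xi with a known pdf; the pdf of the output X is unknown.
xi = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)
x = S(xi)  # output samples of X = S(xi)

# Only a finite amount of information passes through S:
# e.g. a few sample moments of X.
moments = [np.mean(x**k) for k in range(3)]
```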
Outline:
- Methodology
- Error analysis
- Experiments (i.a. a rough random obstacle problem)
Idea 1 (→ Fabio's talk on Monday):
Determine point values ρ(x_i) for x_1 < x_2 < ... < x_N; interpolate for x ∈ (x_i, x_{i+1}).
M. B. Giles, T. Nagapetyan, and K. Ritter, Multilevel Monte Carlo Approximation of Distribution Functions and Densities, SIAM/ASA J. Uncertain. Quantif., 3 (2015).
Advantage: more point values → better approximation (stability, convergence).
Drawback: many unknown parameters in the algorithm (complex).
See also the antiderivative approach: S. Krumscheid and F. Nobile, Multilevel Monte Carlo approximation of functions, MATHICSE technical report.
Idea 2 (explored in this talk):
Determine the (generalized) moments µ_0, ..., µ_R of X:  µ_k = E[φ_k(X)].
Reconstruct a density η that satisfies the moment constraints:
a) µ_k = ∫ φ_k(x) η(x) dx,
b) η(x) ≥ 0 and ∫ η(x) dx = 1.
C. Bierig and A. Chernov, Approximation of probability density functions by the Multilevel Monte Carlo Maximum Entropy method, J. Comput. Physics, 314 (2016). Based on earlier works [Csiszár 75], [Barron, Sheu 91], [Borwein, Lewis 91].
Advantage: only a few parameters (simple).
Drawback: more moments → better approximation? (stability?)
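Step 1 of this idea — estimating the generalized moments by Monte Carlo — can be sketched as follows. The Legendre basis rescaled to an interval [a, b] and the log-normal sample are illustrative choices, not the talk's exact configuration.

```python
import numpy as np
from numpy.polynomial.legendre import Legendre

rng = np.random.default_rng(1)
a, b = 0.0, 4.0

# Samples of X (here: a log-normal stand-in), restricted to [a, b].
x = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)
x = x[(x >= a) & (x <= b)]

def phi(k, x):
    """k-th Legendre polynomial mapped from [-1, 1] to [a, b]."""
    t = 2.0 * (x - a) / (b - a) - 1.0
    return Legendre.basis(k)(t)

# Generalized moments mu_k = E[phi_k(X)] estimated by plain MC.
R = 6
mu = np.array([phi(k, x).mean() for k in range(R + 1)])
```

Since φ_0 ≡ 1, the estimate of µ_0 is exactly 1 regardless of the sample.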
Observe: if the moments µ_0, ..., µ_R are consistent, the reconstructed density η is usually not uniquely determined! How to select the most appropriate density?
The Maximum Entropy (ME) method: find
ρ_R = argmax_η ( − ∫ η(x) ln η(x) dx )
under the constraints:
a) µ_k = ∫ φ_k(x) η(x) dx, k = 0, ..., R,
b) η(x) ≥ 0 and ∫ η(x) dx = 1.
"It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information." [E. T. Jaynes, 1957]
The Maximum Entropy (ME) method: the solution to this problem can be equivalently characterized as
ρ_R(x) ∝ exp( Σ_{k=0}^{R} λ_k φ_k(x) ),  λ_k ∈ ℝ,
where λ_0, ..., λ_R satisfy the constraints (moment matching)
µ_k = ∫ φ_k(x) ρ_R(x) dx,  k = 0, ..., R,
with µ_0 = 1, φ_0(x) = 1.
In this sense: entropy maximization ⇔ moment matching.
Test example: ρ is the log-normal distribution with µ = 0 and σ = 0.5 and 0.2.
Estimation of the moments µ_0, ..., µ_R by MC with 10^8 samples.
λ = (λ_0, ..., λ_R) is determined by the Newton-Raphson method:
∫ φ_k(x) ρ_R(λ, x) dx = µ_k,  k = 0, ..., R.
Stopping parameters for the Newton-Raphson method: ‖Δλ‖ ≤ 10^−9 (convergence), ‖Δλ‖ ≥ 10^3 (no convergence), maximal #iter (no convergence).
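A minimal sketch of this moment-matching Newton iteration (plain monomial basis, Gauss-Legendre quadrature on [a, b]; the stopping thresholds and defaults are illustrative, not the talk's implementation):

```python
import numpy as np
from numpy.polynomial.legendre import leggauss

def max_entropy_density(mu, a=0.0, b=1.0, n_quad=200, max_iter=50, tol=1e-10):
    """Find lambda with int phi_k rho_R(lambda, x) dx = mu_k, k = 0..R."""
    nodes, weights = leggauss(n_quad)
    x = 0.5 * (b - a) * nodes + 0.5 * (a + b)      # quadrature nodes on [a, b]
    w = 0.5 * (b - a) * weights
    R = len(mu) - 1
    Phi = np.vstack([x**k for k in range(R + 1)])  # monomials phi_k(x) = x^k
    lam = np.zeros(R + 1)
    for _ in range(max_iter):
        rho = np.exp(lam @ Phi)                    # phi_0 = 1 absorbs normalization
        g = (Phi * rho * w).sum(axis=1) - mu       # moment residuals
        H = (Phi * rho * w) @ Phi.T                # Jacobian: int phi_j phi_k rho dx
        step = np.linalg.solve(H, g)
        lam -= step
        if np.linalg.norm(step) < tol:             # convergence test on ||dlambda||
            break
    return lam, x, np.exp(lam @ Phi)

# Smoke test: the moments of the uniform density on [0, 1] are mu_k = 1/(k+1),
# and the ME reconstruction is the uniform density itself (lambda = 0).
mu = np.array([1.0 / (k + 1) for k in range(4)])
lam, x, rho = max_entropy_density(mu)
```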
[Figures: ME reconstructions of the log-normal density for R = 1, ..., 6 using monomial moments; each panel reports R, the entropy and the final ‖Δλ‖.]
Monomial moments: unstable for R ≥ 5.
[Figures: ME reconstructions for increasing R using Legendre moments; each panel reports R, the entropy and the final ‖Δλ‖.]
Legendre moments: stable for R ≤ 8. The entropy may decrease even when the Newton method does not converge.
[Figures: ME reconstructions for increasing R using Fourier moments; each panel reports R, the entropy and the final ‖Δλ‖.]
Fourier moments: stable. The entropy is monotonically decreasing.
Breaking convergence for the Fourier basis by choosing a more concentrated density, e.g. log-normal with µ = 0, σ = 0.2.
[Figures: ME reconstructions for increasing R using Fourier moments for the concentrated density.]
Fourier moments (σ = 0.2, [a, b] = [0, 2]): remain stable even without convergence! The entropy is still monotonically decreasing!
Regain stability of the Legendre basis by choosing a smaller approximation interval, e.g. [a, b] = [0, 4].
[Figures: ME reconstructions for increasing R using Legendre moments on the smaller interval.]
Legendre moments (σ = 0.2, [a, b] = [0, 4]): still quite stable even without convergence! The entropy is still monotonically decreasing!
[Figures: the same reconstructions on a semilog scale.]
Legendre moments (σ = 0.2, [a, b] = [0, 4], semilog): the semilog view reveals oscillations in the region where the density is nearly zero, without destroying the stability of the density.
[Figures: ME reconstructions for increasing R using Legendre moments on the tighter interval.]
Legendre moments (σ = 0.2, [a, b] = [0.3, 3]): stable and convergent for a bigger range R ≤ 14! The entropy is monotonically decreasing (until the statistical accuracy of MC is reached?).
Observation: the µ_k are not known exactly; we only have µ̃_k ≈ µ_k, k = 0, ..., R (approximate moments µ̃_k may be computed e.g. by MC or MLMC).
Moment matching with µ̃_k → perturbed density
ρ̃_R ∈ E := { p_θ ∝ exp( Σ_{k=0}^{R} θ_k φ_k(x) ),  θ_k ∈ ℝ }.
Question: what is the distance between ρ and ρ̃_R?
Consider the natural metric
D_KL(p ‖ q) := ∫ p(x) ln( p(x)/q(x) ) dx
("Kullback-Leibler divergence" or "relative entropy").
Pythagoras' theorem:
D_KL(ρ ‖ ρ̃_R) = D_KL(ρ ‖ ρ_R) [truncation error] + D_KL(ρ_R ‖ ρ̃_R) [estimation error],
where ρ is the exact density, ρ_R ∈ E is the ME solution for the exact moments µ_0, ..., µ_R, and ρ̃_R ∈ E is the ME solution for the approximate moments µ̃_0, ..., µ̃_R.
Truncation error: D_KL(ρ ‖ ρ_R) → 0 when R → ∞.
Estimation error: D_KL(ρ_R ‖ ρ̃_R) → 0 when µ̃_k → µ_k.
Our aim is the rigorous quantification of these statements.
From D_KL to L^p norms:
Relation to L^p norms:
i) (1/2) ‖ρ − η‖²_{L¹} ≤ D_KL(ρ ‖ η);
ii) C_p ‖ρ − η‖²_{L^p} ≤ D_KL(ρ ‖ η) for ρ, η ∈ L^∞.
Relation to L² norms of the log-density: for two pdfs ρ and η with ln(ρ/η) ∈ L^∞,
D_KL(ρ ‖ η) ≤ (1/2) e^{‖ln(ρ/η)‖_{L^∞}} ∫ |ln(ρ/η)|² ρ dx
and
D_KL(ρ ‖ η) ≤ (1/2) e^{‖ln(ρ/η) − c‖_{L^∞}} ∫ |ln(ρ/η) − c|² ρ dx   (#)
for any c ∈ ℝ.
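Relation (i) is the Csiszár-Kullback-Pinsker inequality, which can be checked numerically on a grid; the two Gaussian-shaped densities below are arbitrary examples.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 2001)
dx = x[1] - x[0]

# Two strictly positive densities on [0, 1], normalized on the grid.
rho = np.exp(-(x - 0.3)**2 / 0.02); rho /= rho.sum() * dx
eta = np.exp(-(x - 0.5)**2 / 0.05); eta /= eta.sum() * dx

l1 = np.abs(rho - eta).sum() * dx              # ||rho - eta||_{L^1}
dkl = (rho * np.log(rho / eta)).sum() * dx     # D_KL(rho || eta)

pinsker_gap = dkl - 0.5 * l1**2                # >= 0 by Pinsker's inequality
```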
The truncation error is driven by the smoothness of ρ: polynomial moments (span{1, φ_1, ..., φ_R} = P_R) → polynomial best approximation.
Convergence of the truncation error:
D_KL(ρ ‖ ρ_R) ≤ (1/2) inf_{v ∈ P_R} [ exp{ ‖ln ρ − v‖_{L^∞} } ‖ln ρ − v‖²_{L²(ρ)} ] ≲ { R^{−2s} when ln ρ ∈ H^s;  exp(−bR) when ln ρ is analytic }.
Proof: choose η = e^v / ∫ e^v for v ∈ P_R in (#).
The estimation error is driven by the perturbation of the moments.
Convergence of the estimation error (a.s. version): suppose ‖µ̃ − µ‖ ≤ (2 A_R C_R)^{−1}, where A_R, C_R are explicit constants; then ρ̃_R exists and there holds
D_KL(ρ_R ‖ ρ̃_R) ≤ C_R ‖µ̃ − µ‖².
Here A_R = max{ ‖v‖_{L^∞} / ‖v‖_{L²} : v ∈ P_R } and C_R = 2 exp{1 + ‖ln ρ_R‖_{L^∞}}.
Caution: if the µ̃_k are computed by MC, ρ̃_R may fail to exist (with some probability).
Convergence of the estimation error (probabilistic version, simplified): suppose E‖µ̃ − µ‖² ≤ (2 A_R C_R)^{−2} with the same A_R, C_R; then ρ̃_R exists with probability at least p and it holds
D_KL(ρ_R ‖ ρ̃_R) ≤ C_{R,p} E‖µ̃ − µ‖².   (∗)
Here
E‖µ̃ − µ‖² = Σ_{k=1}^{R} E|µ̃_k − µ_k|²  (the standard MSE of µ̃_k),
which can be handled by the standard (ML)MC theory.
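The MSE decomposition above is where the multilevel machinery enters. A toy two-level estimator for a single moment might look as follows; the level hierarchy (truncations of the exponential series) is an illustrative stand-in for a real solver hierarchy.

```python
from math import factorial
import numpy as np

rng = np.random.default_rng(2)

def X_level(xi, level):
    """Level-l approximation of X = exp(xi): a truncated series stand-in."""
    return sum(xi**n / factorial(n) for n in range(4 * (level + 1)))

phi = lambda x: x  # first moment, phi_1(x) = x

# Level 0: many cheap coarse samples.
xi0 = rng.uniform(0.0, 1.0, size=50_000)
level0 = phi(X_level(xi0, 0)).mean()

# Correction level: few samples of the fine/coarse difference on COMMON inputs.
xi1 = rng.uniform(0.0, 1.0, size=2_000)
correction = (phi(X_level(xi1, 1)) - phi(X_level(xi1, 0))).mean()

mu_mlmc = level0 + correction  # telescoping MLMC estimate of E[phi(X)]
```

For ξ ~ U(0, 1) the exact value is E[exp(ξ)] = e − 1, so the estimate can be checked directly.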
Structure of the estimate:
D_KL(ρ ‖ ρ̃_R) = D_KL(ρ ‖ ρ_R) + D_KL(ρ_R ‖ ρ̃_R)
≲ [polynomial best approximation, assuming ln ρ ∈ H^s] + C_{R,p} Σ_{k=1}^{R} MSE(µ̃_k)
≲ { R^{−2s} when ln ρ ∈ H^s;  exp(−bR) when ln ρ is analytic } + C_{R,p} Σ_{k=1}^{R} ( Bias(µ̃_k)² + Var(µ̃_k) ).
C_R is uniformly bounded when ln ρ ∈ H^s, s > 1/2.
Error vs. cost theorems ← balancing the error contributions.
Theorem (Monte Carlo ME: accuracy/cost, simplified). Assume in addition that
a) E[(X − X_l)²] ≲ N_l^{−β},  b) Cost(X_l) ≲ N_l^{γ}.
Then D_KL(ρ ‖ ρ̃_R) < ε with probability p, and for the
single-level approximation of Legendre moments:
Cost(ρ̃_R) ≲ (pε)^{−(β+γ)/β} · { ε^{−(β+γ(δ+1))/(2sβ)} if ln ρ ∈ H^s;  |ln ε|^{(β+γ(δ+1))/β} if ln ρ is analytic };
multilevel approximation of Legendre moments:
Cost(ρ̃_R) ≲ (pε)^{−max(β,γ)/β} · { ε^{−(δ+1) max(β,γ)/(2sβ)} if ln ρ ∈ H^s;  |ln ε|^{(δ+1) max(β,γ)/β} if ln ρ is analytic }.
Here δ is such that E[(φ_k(X) − φ_k(X_l))²] ≲ k^δ N_l^{−β}, and β ≤ γ.
Contact with rough surfaces ψ(x).
Courtesy: Prof. Udo Nackenhorst, IBNM, Univ. Hannover.
Unknown parameter: ψ(x) is the road surface profile (irregular microstructure).
Model: contact of an elastic membrane with a rough surface (2d):
−Δu ≥ f,  u ≥ ψ,  (Δu + f)(u − ψ) = 0  in D,
u = 0 on ∂D  (here D = [0, 1]²).
QoI: membrane deformation u(x); contact area Λ(ω) = |{x : u(x) = ψ(x)}|.
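A minimal solver sketch for the discrete obstacle problem (projected Gauss-Seidel in 1d rather than the monotone multigrid used in the talk; the load, obstacle and mesh are illustrative):

```python
import numpy as np

n = 100
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)
f = -1.0 * np.ones(n + 1)            # downward load on the membrane
psi = -0.05 - 0.3 * (x - 0.5)**2     # obstacle below the membrane

u = np.zeros(n + 1)                  # u = 0 on the boundary
for _ in range(5000):                # projected Gauss-Seidel sweeps
    for i in range(1, n):
        # unconstrained Gauss-Seidel update for the discrete -u'' = f ...
        u_gs = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])
        # ... projected onto the admissible set u >= psi
        u[i] = max(u_gs, psi[i])

# Fraction of interior nodes in contact (where u = psi).
contact = np.mean(np.isclose(u[1:-1], psi[1:-1]))
```

Because the projection sets contact nodes exactly to ψ, the discrete contact set can be read off directly, mirroring the quantity of interest Λ above.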
Example: rough obstacle models. Power spectrum [Persson et al. 05]:
ψ(x) = Σ_{q_l ≤ |q| ≤ q_s} B_q(H) cos(q · x + ϕ_q),
where the amplitude B_q(H) follows a power law in max(|q|, q_l) with exponent determined by the Hurst parameter H.
Many materials in nature and technology obey this law for the amplitudes.
H ∼ U(0, 1): random roughness;  ϕ_q ∼ U(0, 2π): random phase.
Forward solver: own implementation of MMG (TNNM) [Kornhuber 94, ...].
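A 1d profile in the spirit of this model can be synthesized directly. The amplitude law below (pure power law q^−(H+1)) is an assumed stand-in for the slide's exact B_q(H) formula, and q_l, q_s are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def rough_profile(x, q_l=1, q_s=26, H=None):
    """Random rough profile: cosine sum with power-law amplitudes."""
    H = rng.uniform(0.0, 1.0) if H is None else H  # random roughness exponent
    psi = np.zeros_like(x)
    for q in range(q_l, q_s + 1):
        B_q = q**(-(H + 1.0))                      # power-law amplitude (assumed form)
        phi_q = rng.uniform(0.0, 2.0 * np.pi)      # random phase
        psi += B_q * np.cos(2.0 * np.pi * q * x + phi_q)
    return psi

x = np.linspace(0.0, 1.0, 512)
psi = rough_profile(x, H=0.5)
```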
Obstacle surfaces of variable/random roughness ψ = ψ(x, ω):
[Figures: sample surfaces for several roughness values.]
Estimation of the pdf ρ_X of the contact area X = |Λ| by the Maximum Entropy method.
[Figures: ME estimate compared with kernel density estimates at several levels and an L² projection, all based on MLMC.]
The peak(s) correspond to ca. 28.2% of the membrane being in contact with the surface.
More experiments and rigorous error analysis in [Bierig/Chernov, JCP, 2016].
Summary:
- Rigorous convergence analysis for the MLMC Maximum Entropy method for compactly supported densities.
- Numerical experiments: the smoothness assumptions may be relaxed.
- Open issue: how to select the number of moment constraints R in practical computations (in the pre-asymptotic regime)? Adaptivity?
- With an appropriate R the method is able to produce good approximations for a broad class of densities.
Thank you for your attention!
References:
C. Bierig and A. Chernov, Approximation of probability density functions by the Multilevel Monte Carlo Maximum Entropy method, J. Comput. Physics, 314 (2016).
M. B. Giles, T. Nagapetyan, and K. Ritter, Multilevel Monte Carlo Approximation of Distribution Functions and Densities, SIAM/ASA J. Uncertain. Quantif., 3 (2015).
Exact synthetic pdfs ρ_1, ..., ρ_5.
[Figure: five synthetic pdfs of different smoothness.]
Numerical examples for synthetic pdfs of different smoothness (best Maximum Entropy approximation over R = 1, ..., 6 moments vs. KDE at several levels, with MISE-vs-cost plots):
- ln ρ_1 is analytic: the smoothness assumptions are satisfied.
- ln ρ_2 ∈ H^{1/2−ε}(0, 1): the smoothness assumptions are not satisfied.
- ln ρ_5 is unbounded: the smoothness assumptions are not satisfied.
[Figures: ME and KDE reconstructions and MISE vs. cost for each case.]
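The KDE baseline used for comparison can be sketched as a plain Gaussian kernel estimator; the bandwidth rule (Silverman's rule of thumb) and the sample are illustrative choices, not the talk's configuration.

```python
import numpy as np

rng = np.random.default_rng(4)
samples = rng.normal(0.5, 0.1, size=5_000)

def kde(x, samples, bandwidth=None):
    """Gaussian kernel density estimate evaluated at the points x."""
    if bandwidth is None:
        bandwidth = 1.06 * samples.std() * len(samples)**(-0.2)  # Silverman
    diffs = (x[:, None] - samples[None, :]) / bandwidth
    return np.exp(-0.5 * diffs**2).mean(axis=1) / (bandwidth * np.sqrt(2 * np.pi))

x = np.linspace(0.0, 1.0, 201)
density = kde(x, samples)
```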
More informationTECHNISCHE UNIVERSITÄT MÜNCHEN. Uncertainty Quantification in Fluid Flows via Polynomial Chaos Methodologies
TECHNISCHE UNIVERSITÄT MÜNCHEN Bachelor s Thesis in Engineering Science Uncertainty Quantification in Fluid Flows via Polynomial Chaos Methodologies Jan Sültemeyer DEPARTMENT OF INFORMATICS MUNICH SCHOOL
More informationOblique derivative problems for elliptic and parabolic equations, Lecture II
of the for elliptic and parabolic equations, Lecture II Iowa State University July 22, 2011 of the 1 2 of the of the As a preliminary step in our further we now look at a special situation for elliptic.
More informationPolynomial chaos expansions for structural reliability analysis
DEPARTMENT OF CIVIL, ENVIRONMENTAL AND GEOMATIC ENGINEERING CHAIR OF RISK, SAFETY & UNCERTAINTY QUANTIFICATION Polynomial chaos expansions for structural reliability analysis B. Sudret & S. Marelli Incl.
More informationThe Expectation Maximization or EM algorithm
The Expectation Maximization or EM algorithm Carl Edward Rasmussen November 15th, 2017 Carl Edward Rasmussen The EM algorithm November 15th, 2017 1 / 11 Contents notation, objective the lower bound functional,
More informationCombining multiple surrogate models to accelerate failure probability estimation with expensive high-fidelity models
Combining multiple surrogate models to accelerate failure probability estimation with expensive high-fidelity models Benjamin Peherstorfer a,, Boris Kramer a, Karen Willcox a a Department of Aeronautics
More informationInformation geometry for bivariate distribution control
Information geometry for bivariate distribution control C.T.J.Dodson + Hong Wang Mathematics + Control Systems Centre, University of Manchester Institute of Science and Technology Optimal control of stochastic
More informationFast Numerical Methods for Stochastic Computations
Fast AreviewbyDongbinXiu May 16 th,2013 Outline Motivation 1 Motivation 2 3 4 5 Example: Burgers Equation Let us consider the Burger s equation: u t + uu x = νu xx, x [ 1, 1] u( 1) =1 u(1) = 1 Example:
More informationRenormalization Group: non perturbative aspects and applications in statistical and solid state physics.
Renormalization Group: non perturbative aspects and applications in statistical and solid state physics. Bertrand Delamotte Saclay, march 3, 2009 Introduction Field theory: - infinitely many degrees of
More informationSpatially Smoothed Kernel Density Estimation via Generalized Empirical Likelihood
Spatially Smoothed Kernel Density Estimation via Generalized Empirical Likelihood Kuangyu Wen & Ximing Wu Texas A&M University Info-Metrics Institute Conference: Recent Innovations in Info-Metrics October
More informationA High Order Conservative Semi-Lagrangian Discontinuous Galerkin Method for Two-Dimensional Transport Simulations
Motivation Numerical methods Numerical tests Conclusions A High Order Conservative Semi-Lagrangian Discontinuous Galerkin Method for Two-Dimensional Transport Simulations Xiaofeng Cai Department of Mathematics
More informationA recovery-assisted DG code for the compressible Navier-Stokes equations
A recovery-assisted DG code for the compressible Navier-Stokes equations January 6 th, 217 5 th International Workshop on High-Order CFD Methods Kissimmee, Florida Philip E. Johnson & Eric Johnsen Scientific
More informationSimultaneous Robust Design and Tolerancing of Compressor Blades
Simultaneous Robust Design and Tolerancing of Compressor Blades Eric Dow Aerospace Computational Design Laboratory Department of Aeronautics and Astronautics Massachusetts Institute of Technology GTL Seminar
More informationMonte Carlo Methods for Uncertainty Quantification
Monte Carlo Methods for Uncertainty Quantification Mike Giles Mathematical Institute, University of Oxford Contemporary Numerical Techniques Mike Giles (Oxford) Monte Carlo methods 1 / 23 Lecture outline
More informationRobust exponential convergence of hp-fem for singularly perturbed systems of reaction-diffusion equations
Robust exponential convergence of hp-fem for singularly perturbed systems of reaction-diffusion equations Christos Xenophontos Department of Mathematics and Statistics University of Cyprus joint work with
More informationOn Multigrid for Phase Field
On Multigrid for Phase Field Carsten Gräser (FU Berlin), Ralf Kornhuber (FU Berlin), Rolf Krause (Uni Bonn), and Vanessa Styles (University of Sussex) Interphase 04 Rome, September, 13-16, 2004 Synopsis
More informationQuarkonial frames of wavelet type - Stability, approximation and compression properties
Quarkonial frames of wavelet type - Stability, approximation and compression properties Stephan Dahlke 1 Peter Oswald 2 Thorsten Raasch 3 ESI Workshop Wavelet methods in scientific computing Vienna, November
More informationProblem set 2 The central limit theorem.
Problem set 2 The central limit theorem. Math 22a September 6, 204 Due Sept. 23 The purpose of this problem set is to walk through the proof of the central limit theorem of probability theory. Roughly
More informationBACKGROUNDS. Two Models of Deformable Body. Distinct Element Method (DEM)
BACKGROUNDS Two Models of Deformable Body continuum rigid-body spring deformation expressed in terms of field variables assembly of rigid-bodies connected by spring Distinct Element Method (DEM) simple
More informationCovariance function estimation in Gaussian process regression
Covariance function estimation in Gaussian process regression François Bachoc Department of Statistics and Operations Research, University of Vienna WU Research Seminar - May 2015 François Bachoc Gaussian
More informationPerformance Evaluation of Generalized Polynomial Chaos
Performance Evaluation of Generalized Polynomial Chaos Dongbin Xiu, Didier Lucor, C.-H. Su, and George Em Karniadakis 1 Division of Applied Mathematics, Brown University, Providence, RI 02912, USA, gk@dam.brown.edu
More informationStat410 Probability and Statistics II (F16)
Stat4 Probability and Statistics II (F6 Exponential, Poisson and Gamma Suppose on average every /λ hours, a Stochastic train arrives at the Random station. Further we assume the waiting time between two
More informationAccounting for Hyperparameter Uncertainty in SAE Based on a State-Space Model: the Dutch Labour Force Survey Oksana Bollineni-Balabay, Jan van den
Accounting for Hyperparameter Uncertainty in SAE Based on a State-Space Model: the Dutch Labour Force Survey Oksana Bollineni-Balabay, Jan van den Brakel, Franz Palm The Dutch LFS monthly estimates for
More informationAn Alternative Method for Estimating and Simulating Maximum Entropy Densities
An Alternative Method for Estimating and Simulating Maximum Entropy Densities Jae-Young Kim and Joonhwan Lee Seoul National University May, 8 Abstract This paper proposes a method of estimating and simulating
More informationProbing the covariance matrix
Probing the covariance matrix Kenneth M. Hanson Los Alamos National Laboratory (ret.) BIE Users Group Meeting, September 24, 2013 This presentation available at http://kmh-lanl.hansonhub.com/ LA-UR-06-5241
More informationChapter 9: Basic of Hypercontractivity
Analysis of Boolean Functions Prof. Ryan O Donnell Chapter 9: Basic of Hypercontractivity Date: May 6, 2017 Student: Chi-Ning Chou Index Problem Progress 1 Exercise 9.3 (Tightness of Bonami Lemma) 2/2
More informationThermalization of Random Motion in Weakly Confining Potentials
Open Systems & Information Dynamics Vol. 17, No. 3 (2010) 1 10 DOI: xxx c World Scientific Publishing Company Thermalization of Random Motion in Weakly Confining Potentials Piotr Garbaczewski and Vladimir
More informationStability and Sensitivity of the Capacity in Continuous Channels. Malcolm Egan
Stability and Sensitivity of the Capacity in Continuous Channels Malcolm Egan Univ. Lyon, INSA Lyon, INRIA 2019 European School of Information Theory April 18, 2019 1 / 40 Capacity of Additive Noise Models
More informationThe circular law. Lewis Memorial Lecture / DIMACS minicourse March 19, Terence Tao (UCLA)
The circular law Lewis Memorial Lecture / DIMACS minicourse March 19, 2008 Terence Tao (UCLA) 1 Eigenvalue distributions Let M = (a ij ) 1 i n;1 j n be a square matrix. Then one has n (generalised) eigenvalues
More informationGeneral Power Series
General Power Series James K. Peterson Department of Biological Sciences and Department of Mathematical Sciences Clemson University March 29, 2018 Outline Power Series Consequences With all these preliminaries
More informationA Training-time Analysis of Robustness in Feed-Forward Neural Networks
A Training-time Analysis of Robustness in Feed-Forward Neural Networks Cesare Alippi Dipartimento di Elettronica e Informazione Politecnico di Milano Milano, Italy E-mail: alippi@elet.polimi Daniele Sana,
More informationToward Approximate Moving Least Squares Approximation with Irregularly Spaced Centers
Toward Approximate Moving Least Squares Approximation with Irregularly Spaced Centers Gregory E. Fasshauer Department of Applied Mathematics, Illinois Institute of Technology, Chicago, IL 6066, U.S.A.
More informationFinding the best mismatched detector for channel coding and hypothesis testing
Finding the best mismatched detector for channel coding and hypothesis testing Sean Meyn Department of Electrical and Computer Engineering University of Illinois and the Coordinated Science Laboratory
More informationUncertainty representation and propagation in chemical networks
Uncertainty representation and propagation in chemical networks P. Pernot Laboratoire de Chimie Physique, Orsay Plan 1 Introduction 2 Uncertainty propagation 3 Treatment of uncertainties for chemical rates
More informationVariational Inference via Stochastic Backpropagation
Variational Inference via Stochastic Backpropagation Kai Fan February 27, 2016 Preliminaries Stochastic Backpropagation Variational Auto-Encoding Related Work Summary Outline Preliminaries Stochastic Backpropagation
More informationApproximation of BSDEs using least-squares regression and Malliavin weights
Approximation of BSDEs using least-squares regression and Malliavin weights Plamen Turkedjiev (turkedji@math.hu-berlin.de) 3rd July, 2012 Joint work with Prof. Emmanuel Gobet (E cole Polytechnique) Plamen
More informationLocal Density Approximation for the Almost-bosonic Anyon Gas. Michele Correggi
Local Density Approximation for the Almost-bosonic Anyon Gas Michele Correggi Università degli Studi Roma Tre www.cond-math.it QMATH13 Many-body Systems and Statistical Mechanics joint work with D. Lundholm
More informationAnton ARNOLD. with N. Ben Abdallah (Toulouse), J. Geier (Vienna), C. Negulescu (Marseille) TU Vienna Institute for Analysis and Scientific Computing
TECHNISCHE UNIVERSITÄT WIEN Asymptotically correct finite difference schemes for highly oscillatory ODEs Anton ARNOLD with N. Ben Abdallah (Toulouse, J. Geier (Vienna, C. Negulescu (Marseille TU Vienna
More informationStatistics: Learning models from data
DS-GA 1002 Lecture notes 5 October 19, 2015 Statistics: Learning models from data Learning models from data that are assumed to be generated probabilistically from a certain unknown distribution is a crucial
More informationDepartment of Applied Mathematics and Theoretical Physics. AMA 204 Numerical analysis. Exam Winter 2004
Department of Applied Mathematics and Theoretical Physics AMA 204 Numerical analysis Exam Winter 2004 The best six answers will be credited All questions carry equal marks Answer all parts of each question
More informationOn the Boltzmann equation: global solutions in one spatial dimension
On the Boltzmann equation: global solutions in one spatial dimension Department of Mathematics & Statistics Colloque de mathématiques de Montréal Centre de Recherches Mathématiques November 11, 2005 Collaborators
More informationUncertainty. Jayakrishnan Unnikrishnan. CSL June PhD Defense ECE Department
Decision-Making under Statistical Uncertainty Jayakrishnan Unnikrishnan PhD Defense ECE Department University of Illinois at Urbana-Champaign CSL 141 12 June 2010 Statistical Decision-Making Relevant in
More informationEE/Stats 376A: Homework 7 Solutions Due on Friday March 17, 5 pm
EE/Stats 376A: Homework 7 Solutions Due on Friday March 17, 5 pm 1. Feedback does not increase the capacity. Consider a channel with feedback. We assume that all the recieved outputs are sent back immediately
More informationBickel Rosenblatt test
University of Latvia 28.05.2011. A classical Let X 1,..., X n be i.i.d. random variables with a continuous probability density function f. Consider a simple hypothesis H 0 : f = f 0 with a significance
More informationTwo new developments in Multilevel Monte Carlo methods
Two new developments in Multilevel Monte Carlo methods Mike Giles Abdul-Lateef Haji-Ali Oliver Sheridan-Methven Mathematical Institute, University of Oxford Rob Scheichl s Farewell Conference University
More informationMultilevel Monte Carlo methods for highly heterogeneous media
Multilevel Monte Carlo methods for highly heterogeneous media Aretha L. Teckentrup University of Bath Claverton Down Bath, BA2 7AY, UK Abstract We discuss the application of multilevel Monte Carlo methods
More informationCS Lecture 19. Exponential Families & Expectation Propagation
CS 6347 Lecture 19 Exponential Families & Expectation Propagation Discrete State Spaces We have been focusing on the case of MRFs over discrete state spaces Probability distributions over discrete spaces
More informationInterpolation via weighted l 1 -minimization
Interpolation via weighted l 1 -minimization Holger Rauhut RWTH Aachen University Lehrstuhl C für Mathematik (Analysis) Matheon Workshop Compressive Sensing and Its Applications TU Berlin, December 11,
More informationMinimax lower bounds I
Minimax lower bounds I Kyoung Hee Kim Sungshin University 1 Preliminaries 2 General strategy 3 Le Cam, 1973 4 Assouad, 1983 5 Appendix Setting Family of probability measures {P θ : θ Θ} on a sigma field
More informationMultilevel Monte Carlo for uncertainty quantification in structural engineering
Multilevel Monte Carlo for uncertainty quantification in structural engineering P. Blondeel a,, P. Robbe a, C. Van hoorickx b, G. Lombaert b, S. Vandewalle a a KU Leuven, Department of Computer Science,
More informationIntegrating Correlated Bayesian Networks Using Maximum Entropy
Applied Mathematical Sciences, Vol. 5, 2011, no. 48, 2361-2371 Integrating Correlated Bayesian Networks Using Maximum Entropy Kenneth D. Jarman Pacific Northwest National Laboratory PO Box 999, MSIN K7-90
More informationGlobally convergent algorithms for PARAFAC with semi-definite programming
Globally convergent algorithms for PARAFAC with semi-definite programming Lek-Heng Lim 6th ERCIM Workshop on Matrix Computations and Statistics Copenhagen, Denmark April 1 3, 2005 Thanks: Vin de Silva,
More informationPlausible Values for Latent Variables Using Mplus
Plausible Values for Latent Variables Using Mplus Tihomir Asparouhov and Bengt Muthén August 21, 2010 1 1 Introduction Plausible values are imputed values for latent variables. All latent variables can
More informationOn the Bennett-Hoeffding inequality
On the Bennett-Hoeffding inequality of Iosif 1,2,3 1 Department of Mathematical Sciences Michigan Technological University 2 Supported by NSF grant DMS-0805946 3 Paper available at http://arxiv.org/abs/0902.4058
More informationUncertainty Quantification in Computational Science
DTU 2010 - Lecture I Uncertainty Quantification in Computational Science Jan S Hesthaven Brown University Jan.Hesthaven@Brown.edu Objective of lectures The main objective of these lectures are To offer
More informationERROR AND UNCERTAINTY QUANTIFICATION AND SENSITIVITY ANALYSIS IN MECHANICS COMPUTATIONAL MODELS
International Journal for Uncertainty Quantification, 1 (2): 147 161 (2011) ERROR AND UNCERTAINTY QUANTIFICATION AND SENSITIVITY ANALYSIS IN MECHANICS COMPUTATIONAL MODELS Bin Liang & Sankaran Mahadevan
More informationCVaR and Examples of Deviation Risk Measures
CVaR and Examples of Deviation Risk Measures Jakub Černý Department of Probability and Mathematical Statistics Stochastic Modelling in Economics and Finance November 10, 2014 1 / 25 Contents CVaR - Dual
More informationPoisson INAR processes with serial and seasonal correlation
Poisson INAR processes with serial and seasonal correlation Márton Ispány University of Debrecen, Faculty of Informatics Joint result with Marcelo Bourguignon, Klaus L. P. Vasconcellos, and Valdério A.
More informationA MULTISCALE APPROACH IN TOPOLOGY OPTIMIZATION
1 A MULTISCALE APPROACH IN TOPOLOGY OPTIMIZATION Grégoire ALLAIRE CMAP, Ecole Polytechnique The most recent results were obtained in collaboration with F. de Gournay, F. Jouve, O. Pantz, A.-M. Toader.
More informationModelling and numerical methods for the diffusion of impurities in a gas
INERNAIONAL JOURNAL FOR NUMERICAL MEHODS IN FLUIDS Int. J. Numer. Meth. Fluids 6; : 6 [Version: /9/8 v.] Modelling and numerical methods for the diffusion of impurities in a gas E. Ferrari, L. Pareschi
More informationElectromagnetic Relaxation Time Distribution Inverse Problems in the Time-domain
Electromagnetic Relaxation Time Distribution Inverse Problems in the Time-domain Prof Nathan L Gibson Department of Mathematics Joint Math Meeting Jan 9, 2011 Prof Gibson (OSU) Inverse Problems for Distributions
More informationCOURSE Numerical integration of functions
COURSE 6 3. Numerical integration of functions The need: for evaluating definite integrals of functions that has no explicit antiderivatives or whose antiderivatives are not easy to obtain. Let f : [a,
More informationOrganization. I MCMC discussion. I project talks. I Lecture.
Organization I MCMC discussion I project talks. I Lecture. Content I Uncertainty Propagation Overview I Forward-Backward with an Ensemble I Model Reduction (Intro) Uncertainty Propagation in Causal Systems
More informationBayesian Inverse problem, Data assimilation and Localization
Bayesian Inverse problem, Data assimilation and Localization Xin T Tong National University of Singapore ICIP, Singapore 2018 X.Tong Localization 1 / 37 Content What is Bayesian inverse problem? What is
More informationRandom Averaging. Eli Ben-Naim Los Alamos National Laboratory. Paul Krapivsky (Boston University) John Machta (University of Massachusetts)
Random Averaging Eli Ben-Naim Los Alamos National Laboratory Paul Krapivsky (Boston University) John Machta (University of Massachusetts) Talk, papers available from: http://cnls.lanl.gov/~ebn Plan I.
More informationQuasi-optimal and adaptive sparse grids with control variates for PDEs with random diffusion coefficient
Quasi-optimal and adaptive sparse grids with control variates for PDEs with random diffusion coefficient F. Nobile, L. Tamellini, R. Tempone, F. Tesei CSQI - MATHICSE, EPFL, Switzerland Dipartimento di
More informationComputer simulation of multiscale problems
Progress in the SSF project CutFEM, Geometry, and Optimal design Computer simulation of multiscale problems Axel Målqvist and Daniel Elfverson University of Gothenburg and Uppsala University Umeå 2015-05-20
More informationThe Dirichlet boundary problems for second order parabolic operators satisfying a Carleson condition
The Dirichlet boundary problems for second order parabolic operators satisfying a Carleson condition Sukjung Hwang CMAC, Yonsei University Collaboration with M. Dindos and M. Mitrea The 1st Meeting of
More informationInformation Theoretic Asymptotic Approximations for Distributions of Statistics
Information Theoretic Asymptotic Approximations for Distributions of Statistics Ximing Wu Department of Agricultural Economics Texas A&M University Suojin Wang Department of Statistics Texas A&M University
More informationOn the coherence barrier and analogue problems in compressed sensing
On the coherence barrier and analogue problems in compressed sensing Clarice Poon University of Cambridge June 1, 2017 Joint work with: Ben Adcock Anders Hansen Bogdan Roman (Simon Fraser) (Cambridge)
More informationStat 451 Lecture Notes Monte Carlo Integration
Stat 451 Lecture Notes 06 12 Monte Carlo Integration Ryan Martin UIC www.math.uic.edu/~rgmartin 1 Based on Chapter 6 in Givens & Hoeting, Chapter 23 in Lange, and Chapters 3 4 in Robert & Casella 2 Updated:
More informationPhysics 403. Segev BenZvi. Propagation of Uncertainties. Department of Physics and Astronomy University of Rochester
Physics 403 Propagation of Uncertainties Segev BenZvi Department of Physics and Astronomy University of Rochester Table of Contents 1 Maximum Likelihood and Minimum Least Squares Uncertainty Intervals
More informationNumerical Analysis Comprehensive Exam Questions
Numerical Analysis Comprehensive Exam Questions 1. Let f(x) = (x α) m g(x) where m is an integer and g(x) C (R), g(α). Write down the Newton s method for finding the root α of f(x), and study the order
More informationEstimating Accuracy in Classical Molecular Simulation
Estimating Accuracy in Classical Molecular Simulation University of Illinois Urbana-Champaign Department of Computer Science Institute for Mathematics and its Applications July 2007 Acknowledgements Ben
More informationKadath: a spectral solver and its application to black hole spacetimes
Kadath: a spectral solver and its application to black hole spacetimes Philippe Grandclément Laboratoire de l Univers et de ses Théories (LUTH) CNRS / Observatoire de Paris F-92195 Meudon, France philippe.grandclement@obspm.fr
More information