Sensitivity analysis by adjoint Automatic Differentiation and Application

Anca Belme, Massimiliano Martinelli, Laurent Hascoët, Valérie Pascual, Alain Dervieux
INRIA Sophia-Antipolis, Project TROPICS, Alain.Dervieux@inria.fr

ERCOFTAC Course on Uncertainty Management and Quantification in Industrial Analysis and Design, Munich, 3-4 March 2011
Scope of the talk

1. Perturbation methods for uncertainties.
2. What is Automatic Differentiation.
3. Computing derivatives.
4. An application to CFD.
1. Perturbation methods for uncertainties

The uncertainty management problem

Consider a smooth functional $j(c) = J(c, W(c))$ where:
- $c \in R^n$ is the set of parameters,
- $W(c) \in R^N$ is the solution of the state equation $\Psi(c, W) = 0$; we assume that $n \ll N$.

Suppose now that $c_\Omega$ is a random parameter with a known probability law; then $j(c_\Omega)$ is a random variable, the law of which we want to evaluate.

The Taylor-based perturbation approach

The idea is to rely on the derivatives of $j$:
$$j(c_0 + \delta c) = j(c_0) + j'(c_0)\,\delta c + \tfrac{1}{2}\, j''(c_0)\,\delta c\,\delta c + \dots$$
Why second derivatives? (1)

Perturbative methods for uncertainty propagation (Taylor expansion-based)

Method of Moments: let $\mu_c$ be the mean of $c_\Omega$, $\sigma_i^2$ the variance of $c_{\Omega,i}$, $\mu_j$ the mean of $j(c_\Omega)$ and $\sigma_j^2$ its variance. With $G$ the gradient and $H$ the Hessian of $j$ at $\mu_c$:
$$\mu_j \approx j(\mu_c) + \frac{1}{2}\sum_i H_{ii}\,\sigma_i^2, \qquad
\sigma_j^2 \approx \sum_i G_i^2\,\sigma_i^2 + \frac{1}{2}\sum_{i,k} H_{ik}^2\,\sigma_i^2\,\sigma_k^2$$
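The two moment formulas above translate directly into a few lines of NumPy. The sketch below is our own illustration (function name and interface are assumptions), valid for independent input parameters with variances $\sigma_i^2$ and a precomputed gradient and Hessian:

```python
import numpy as np

def moments_second_order(j0, grad, hess, sigma2):
    """Second-order method of moments for independent inputs.

    j0     : value j(mu_c) at the mean parameters
    grad   : gradient G_i = dj/dc_i at mu_c
    hess   : Hessian H_ik at mu_c
    sigma2 : variances sigma_i^2 of the input parameters
    """
    # mu_j ~ j(mu_c) + 1/2 sum_i H_ii sigma_i^2
    mu_j = j0 + 0.5 * np.sum(np.diag(hess) * sigma2)
    # sigma_j^2 ~ sum_i G_i^2 sigma_i^2 + 1/2 sum_ik H_ik^2 sigma_i^2 sigma_k^2
    var_j = np.sum(grad**2 * sigma2) \
          + 0.5 * np.sum(hess**2 * np.outer(sigma2, sigma2))
    return mu_j, var_j
```

For $j(c) = c_1^2 + 3c_2$ with standard normal inputs, both formulas are exact: the routine returns mean 1 and variance 11, matching the analytic moments.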
Why second derivatives? (2)

Perturbative methods for uncertainty propagation (Taylor expansion-based)

Reduced-order Taylor model:
$$j_{reduced}(c_0 + \delta c) = j(c_0) + j'(c_0)\,\delta c + \tfrac{1}{2}\, j''(c_0)\,\delta c\,\delta c$$
Evaluate $j_{reduced}(c_\Omega)$ by any sampling method (Monte Carlo, ...).
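Once the gradient and Hessian at $c_0$ are available, the surrogate can be sampled at negligible cost, since no state equation is solved per sample. A minimal sketch (gradient, Hessian and input distribution are illustrative assumptions):

```python
import numpy as np

def reduced_taylor(j0, grad, hess, dc):
    # Quadratic surrogate j_reduced(c0 + dc) built from j, j', j'' at c0
    return j0 + grad @ dc + 0.5 * dc @ hess @ dc

# Monte-Carlo sampling of the surrogate: each sample costs O(n^2) flops,
# no flow solve is involved
rng = np.random.default_rng(0)
grad = np.array([1.0, -0.5])
hess = np.array([[2.0, 0.0], [0.0, 1.0]])
dcs = rng.normal(scale=0.1, size=(100_000, 2))
values = dcs @ grad + 0.5 * np.einsum('si,ij,sj->s', dcs, hess, dcs)
```

The empirical mean and variance of `values` then estimate the law of $j_{reduced}(c_\Omega)$, and agree with the method-of-moments formulas of the previous slide.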
Why second derivatives? (3)

Robust optimization
- Gradient-based methods for $j_{robust}(c) = j(c) + \varepsilon \left\|\frac{dj}{dc}\right\|$
- Gradient-free methods for $j_{robust}(c) = j(c) + \frac{1}{2}\sum_i H_{ii}\,\sigma_i^2$
- Gradient-free methods for $j_{robust}(c) = j(c) + K\,\sigma_j^2$

Optimisation of adjoint-corrected functionals
- Gradient-based methods for $j_{corr}(c) = j(c) - \langle \Psi_{ex}(c,W), \Pi \rangle$
2. What is Automatic Differentiation

A computer code P can be considered as evaluating a function $f$ depending on data $X$.

PROGRAM P:  a = b*x + c

Automatic Differentiation is a technique to evaluate analytically the derivatives of the function $f$ defined by a computer program P.

PROGRAM P':  ȧ = ḃ*x + b*ẋ + ċ

To this end, Automatic Differentiation transforms P into a new program P', either before compilation (source transformation) or at the compilation step (operator overloading).
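In source-transformation form, the differentiated program for the one-instruction example above looks as follows. This is a hand-written Python illustration of the transformation, not actual TAPENADE output:

```python
# Original program P
def P(b, x, c):
    a = b * x + c
    return a

# Tangent-differentiated program P': the derivative statement accompanies
# the original one, propagating the input perturbations (bd, xd, cd)
def P_d(b, bd, x, xd, c, cd):
    ad = bd * x + b * xd + cd   # derivative of a = b*x + c
    a = b * x + c               # original instruction, kept for later reuse
    return a, ad
```

Calling `P_d(b, 1, x, 0, c, 0)` returns the value of `a` together with its partial derivative with respect to `b`.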
2. Automatic Differentiation: Direct/Tangent mode

The tangent differentiated code computes $f'(X)\cdot\dot X$. With
$$f(X) = f_p \circ f_{p-1} \circ \dots \circ f_1(X), \qquad Z_0 = X, \quad Z_k = f_k(Z_{k-1}),$$
the chain rule gives
$$\dot Y = f'(X)\cdot\dot X = f'_p(Z_{p-1}) \cdot f'_{p-1}(Z_{p-2}) \cdot \dots \cdot f'_1(Z_0) \cdot \dot X .$$
Each instruction is replaced by:
- itself: $Z_{k-1} \mapsto Z_k = f_k(Z_{k-1})$
- preceded by its derivative: $\dot Z_{k-1} \mapsto \dot Z_k = f'_k(Z_{k-1})\cdot\dot Z_{k-1}$

Computational cost factor: $\alpha_T \le 4$. Does not store intermediate variables.
2. Automatic Differentiation: Backward/Reverse mode

The reverse differentiated code computes $(f'(X))^t\cdot\bar Y$. With $f(X) = f_p \circ f_{p-1} \circ \dots \circ f_1(X)$:
$$\bar X = f'^t_1(Z_0) \cdot f'^t_2(Z_1) \cdot \dots \cdot f'^t_{p-1}(Z_{p-2}) \cdot f'^t_p(Z_{p-1}) \cdot \bar Y$$
Compute the $Z_k = f_k(Z_{k-1})$ forward and store them, then compute $\bar X$ backward.
Stores intermediate variables. Computational cost factor: $\alpha_R \le 5$.
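The forward-then-backward sweeps can be mimicked in a few lines for a chain of scalar maps. This is a toy sketch of the principle, not an AD tool: `fs` and `dfs` hold the maps $f_k$ and their derivatives $f_k'$, supplied by hand:

```python
import math

def reverse_mode(fs, dfs, x, ybar=1.0):
    # Forward sweep: compute and store Z_0, ..., Z_p
    zs = [x]
    for f in fs:
        zs.append(f(zs[-1]))
    # Backward sweep: apply the (here scalar) transposed derivatives
    # f_p'(Z_{p-1}), ..., f_1'(Z_0) in reverse order
    xbar = ybar
    for df, z in zip(reversed(dfs), reversed(zs[:-1])):
        xbar = df(z) * xbar
    return zs[-1], xbar

# Example: f(x) = sin(x^2), so f'(x) = 2x cos(x^2)
y, xbar = reverse_mode([lambda z: z * z, math.sin],
                       [lambda z: 2 * z, math.cos], 1.5)
```

Note that the backward loop needs the stored `zs`: this is the storage cost that distinguishes reverse from tangent mode.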
2. Automatic Differentiation: Reverse mode, when to use it

For $f : R^n \to R^m$:
- Tangent mode delivers $f'(X)\cdot\dot X$, i.e. $m$ scalar answers.
- Reverse mode delivers $f'^t(X)\cdot\bar Y$, i.e. $n$ scalar answers.
For $n \gg m$, reverse is about $n/m$ times more efficient than tangent. In our case, $m = 1$ and $n > 1$.

Reverse mode, analogy with the adjoint state method
For a functional depending on a state variable, the reverse mode automatically introduces a subset of intermediate adjoint variables which constitutes the adjoint state.

Reverse mode, case of a functional with a state system
The reverse code solves state and adjoint, and produces $n$ scalar answers. The tangent code solves a perturbation state system and produces 1 scalar answer. Divided differences also produce 1 scalar answer, but in less robust conditions due to the small step parameter to choose.
The TAPENADE program differentiator

TAPENADE (115,000 lines of Java) transforms a Fortran 77, Fortran 95 or ANSI C program into a program, in the same language, that computes the derivatives of the given program. TAPENADE is organised around an intermediate step: the analysis and differentiation of an abstract imperative language.

http://www-sop.inria.fr/tropics/
TAPENADE on-line
2. Automatic Differentiation: the TAPENADE program differentiator, recent advances

- TAPENADE V3 takes into account user directives inserted into the sources. Directives are used for optimising storage in reverse mode, and they keep a record of this optimisation that remains available for new versions of the code.
- Directive II-LOOP allows a more efficient reverse differentiation of a frequent class of parallel loops (e.g. assembly loops).
- Directive NOCHECKPOINT can be used for optimal checkpointing in reverse differentiation. TAPENADE lets the user specify finely which procedure calls must be checkpointed or not (e.g. the time-advancing loop).
- TAPENADE is optimised for successive reverse differentiations.
3. Computing the derivatives of a state-dependent functional

The differentiation problem

Consider a computer program computing the functional $j(c) = J(c, W(c))$ where:
- $c \in R^n$ is the set of parameters,
- $W(c) \in R^N$ is the solution of the state equation $\Psi(c, W) = 0$,
- $W(c)$ is a steady-state solution obtained by explicit or implicit (i.e. with a linearised system) pseudo-time advancing techniques.

We want, using TAPENADE, to get a computer program computing the second derivatives
$$\frac{d^2 j}{dc^2} = (H_{ik}).$$
Flow solver: basic algorithm

1. Initialize $c, W_0$
2. Loop: $i = i+1$
   - Compute $\Psi_i = \Psi(c, W_i)$
   - Compute $\delta W_i = F(\Psi_i)$ (implicit or explicit)
   - Update $W_{i+1} = W_i + \delta W_i$
   - until $\|\Psi_i\| < \varepsilon$
3. Compute $j = J(c, W_{itrue})$   [routine functional(j,c)]
Differentiability of a fixed point

For a given $\varepsilon > 0$, possibly large: unsteady view
Due to the test ending the loop, the loop may need $itrue$ iterations for a parameter $c$ and $itrue+1$ (or $itrue-1$) iterations for a $c+\delta c$ extremely close to $c$. This shows that the above discrete functional $j$ is only piecewise continuous or differentiable. Further, the state $W_{itrue}$ depends on the initial conditions of the solution algorithm, much like the solution of an unsteady system. The whole algorithm should then be differentiated and, in the reverse case, the intermediate (unconverged) values of the state $W$ should be stored for the backward adjoint sweep. Lastly, this option assumes we differentiate the solution algorithm itself.

Assuming $\varepsilon = 0$, or small: fixed-point view
Now, by the implicit function theorem, $W$ and $j$ are differentiable. The state $W(c)$, unique solution of the implicit system, does not depend on the initial conditions of the solution algorithm (provided it converges, sufficiently well). In the tangent case, the perturbation system should be sufficiently converged. In the reverse case, the converged state solution is used in all iterations of the adjoint, which we in turn converge sufficiently. This option is chosen.
Do not differentiate a sophisticated solution algorithm

Same iteration as before, but the functional is now evaluated on the converged state only:
1. Initialize $c, W_0$
2. Loop: $i = i+1$; compute $\Psi_i = \Psi(c, W_i)$; compute $\delta W_i = F(\Psi_i)$ (implicit or explicit); update $W_{i+1} = W_i + \delta W_i$; until $\|\Psi_i\| < \varepsilon$
3. Compute $j = J(c, W_\infty)$   [routine functional(j,c)]
Separate system assembly and solution algorithm

1. Initialize $c, W_0$
2. Loop: $i = i+1$; compute $\Psi_i = \Psi(c, W_i)$ [routine stateres(psi,c,w)]; compute $\delta W_i = F(\Psi_i)$ (implicit or explicit); update $W_{i+1} = W_i + \delta W_i$; until $\|\Psi_i\| < \varepsilon$
3. Compute $j = J(c, W_\infty)$   [routine func(j,c,w)]
Non-differentiated matrix-free iterative solver

Assume we use a matrix-free fixed-point (FP) method (e.g. GMRES) for solving a linearised state system $Ax = b$.
- Keep the FP algorithm non-differentiated.
- The FP algorithm for derivatives (state Gateaux derivative, adjoint state) needs the corresponding matrix-by-vector products.
- Matrix-by-vector products are computed by differentiated code (from the state residual evaluation routine stateres(psi,c,w)).
- An algebraic preconditioner (e.g. ILU) must be re-engineered.
First derivatives: basic Tangent algorithm

1. Solve $\Psi(c, W) = 0$
2. Solve $\dfrac{\partial \Psi}{\partial W}\,\theta = -\dfrac{\partial \Psi}{\partial c}\,\dot c$   [solve_matrixfree_tan]
3. Compute $\dot j = \dfrac{\partial J}{\partial W}\,\theta + \dfrac{\partial J}{\partial c}\,\dot c$
Non-differentiated matrix-free iterative solver (2)

Example: GMRES communication for $Ax = b$
- Enter in GMRES an initial solution $x_0$ and a right-hand side $b$.
- GMRES needs matvec: $v \mapsto Av$.
- GMRES needs precon: $v \mapsto M^{-1}v$.

For the direct state Gateaux derivative:
$$b = -\frac{\partial \Psi}{\partial c}\,\dot c, \qquad Av = \frac{\partial \Psi}{\partial W}\,v,$$
and we use the direct mode routine stateres_dw_d(psi, psid, c, w, wd): with wd = $v$ in input, it returns psid = $\frac{\partial \Psi}{\partial W}\,v$. The preconditioner is the one used for the state iterative solution.
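This communication pattern can be sketched with SciPy's matrix-free GMRES: the LinearOperator's matvec plays the role of the differentiated routine. Here a toy residual with a hand-written Jacobian-vector product stands in for the AD-generated stateres_dw_d; all names and the residual itself are illustrative assumptions:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

# Toy state residual Psi(c, W); in the real setting the matvec below
# would call the AD-generated stateres_dw_d instead of this analytic form
def psi(c, W):
    return np.array([W[0]**2 + c * W[1] - 1.0,
                     W[0] + 2.0 * W[1] - c])

def dpsi_dW_matvec(c, W, v):
    # (dPsi/dW) v, written analytically for this toy residual
    return np.array([2.0 * W[0] * v[0] + c * v[1],
                     v[0] + 2.0 * v[1]])

c, W = 0.5, np.array([0.9, -0.2])
cdot = 1.0
dpsi_dc = np.array([W[1], -1.0])   # dPsi/dc for this toy residual

# GMRES never sees the Jacobian matrix, only the matvec closure
A = LinearOperator((2, 2), matvec=lambda v: dpsi_dW_matvec(c, W, v))
theta, info = gmres(A, -dpsi_dc * cdot, atol=1e-12)   # tangent state
```

The solver communicates with the physics only through `matvec`, exactly as in the slide: the fixed-point algorithm itself is never differentiated.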
First-order derivative: Tangent mode

Tangent differentiate stateres with c and w as input:
stateres(psi, c, w)  →  stateres_d(psi, psid, c, cd, w, wd)
- Input variables: c = $c$, w = $W$, cd = $\dot c$, wd = $\dot W$
- Output variables: psi = $\Psi$, psid = $\dot\Psi = \frac{\partial \Psi}{\partial c}\,\dot c + \frac{\partial \Psi}{\partial W}\,\dot W$

Converging the matrix-free solver produces the perturbation state solution $\dot W_{sol}$.
First-order derivative: Tangent mode

Tangent differentiate func with c and w as input:
func(j, c, w)  →  func_d(j, jd, c, cd, w, wd)
- Input variables: c = $c$, w = $W$, cd = $\dot c$, wd = $\dot W_{sol}$
- Output variables: j = $J$, jd = $\dot J = \frac{\partial J}{\partial c}\,\dot c + \frac{\partial J}{\partial W}\,\dot W_{sol}$
First derivative: Reverse algorithm

1. Solve $\Psi(c, W) = 0$
2. Solve $\left(\dfrac{\partial \Psi}{\partial W}\right)^t \Pi = \left(\dfrac{\partial J}{\partial W}\right)^t$   [solve_matrixfree_rev]
3. Compute $\left(\dfrac{dj}{dc}\right)^t = \left(\dfrac{\partial J}{\partial c}\right)^t - \left(\dfrac{\partial \Psi}{\partial c}\right)^t \Pi$
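On a small dense toy problem the two solves of this algorithm read as follows. The state equation, functional and matrices are illustrative assumptions; the adjoint gradient is checked against finite differences:

```python
import numpy as np

# Toy problem: Psi(c, W) = A W - B c = 0,  J(c, W) = 0.5 * W @ W
A = np.array([[3.0, 1.0], [1.0, 2.0]])
B = np.array([[1.0], [1.0]])

def solve_state(c):
    # Step 1: solve Psi(c, W) = 0
    return np.linalg.solve(A, B @ c)

def gradient_adjoint(c):
    W = solve_state(c)
    # Step 2: adjoint system (dPsi/dW)^t Pi = (dJ/dW)^t, with dJ/dW = W
    Pi = np.linalg.solve(A.T, W)
    # Step 3: dj/dc = dJ/dc - (dPsi/dc)^t Pi, with dJ/dc = 0, dPsi/dc = -B
    return B.T @ Pi
```

One adjoint solve yields all $n$ components of the gradient, independently of $n$: this is the efficiency argument of the previous slides.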
Non-differentiated matrix-free iterative solver (3)

Example: GMRES communication for $Ax = b$
- Enter in GMRES an initial solution $x_0$ and a right-hand side $b$.
- GMRES needs matvec: $v \mapsto Av$.
- GMRES needs precon: $v \mapsto M^{-1}v$.

For the reverse adjoint derivative:
$$b = \left(\frac{\partial J}{\partial W}\right)^t, \qquad Av = \left(\frac{\partial \Psi}{\partial W}\right)^t v,$$
and we use the reverse mode routine stateres_dw_b(psi, psib, c, w, wb): with psib = $v$ in input, it returns wb = $\left(\frac{\partial \Psi}{\partial W}\right)^t v$. The preconditioner is the transpose of the one used for the state iterative solution.
First-order derivative: Reverse mode

func(j, c, w)  →  func_b(j, jb, c, cb, w, wb)
- Input variables: c = $c$, w = $W$, jb = $\bar J$
- Output variables: j = $J$, cb = $\bar c = \left(\frac{\partial J}{\partial c}\right)^t \bar J$, wb = $\bar W = \left(\frac{\partial J}{\partial W}\right)^t \bar J$
First-order derivative: Reverse mode

stateres(psi, c, w)  →  stateres_b(psi, psib, c, cb, w, wb)
- Input variables: c = $c$, w = $W$, psib = $\bar\Psi$
- Output variables: psi = $\Psi$, cb = $\bar c = \left(\frac{\partial \Psi}{\partial c}\right)^t \bar\Psi$, wb = $\bar W = \left(\frac{\partial \Psi}{\partial W}\right)^t \bar\Psi$
Second derivative: Tangent-on-Tangent approach

$$\frac{d^2 j}{dc_i\,dc_k} = D^2_{i,k}J - \Pi^t\, D^2_{i,k}\Psi$$
with $\Pi$ solution of the adjoint system
$$\left(\frac{\partial \Psi}{\partial W}\right)^t \Pi = \left(\frac{\partial J}{\partial W}\right)^t$$
and
$$D^2_{i,k}J = \frac{\partial^2 J}{\partial c\,\partial c}(e_i, e_k) + \frac{\partial^2 J}{\partial W\,\partial c}(e_i, \theta_k) + \frac{\partial^2 J}{\partial W\,\partial c}(e_k, \theta_i) + \frac{\partial^2 J}{\partial W\,\partial W}(\theta_i, \theta_k)$$
$$D^2_{i,k}\Psi = \frac{\partial^2 \Psi}{\partial c\,\partial c}(e_i, e_k) + \frac{\partial^2 \Psi}{\partial W\,\partial c}(e_i, \theta_k) + \frac{\partial^2 \Psi}{\partial W\,\partial c}(e_k, \theta_i) + \frac{\partial^2 \Psi}{\partial W\,\partial W}(\theta_i, \theta_k)$$
where $\theta_k = \dfrac{dW}{dc_k}$ is the solution of the system
$$\frac{\partial \Psi}{\partial W}\,\frac{dW}{dc_k} = -\frac{\partial \Psi}{\partial c}\,e_k$$
Second derivatives: basic ToT algorithm

1. Solve $\Psi(c, W) = 0$
2. Solve $\left(\frac{\partial \Psi}{\partial W}\right)^t \Pi = \left(\frac{\partial J}{\partial W}\right)^t$   [solve_matrixfree_adj]
3. For $i = 1, \dots, n$: initialize $e_i$ and solve $\frac{\partial \Psi}{\partial W}\,\theta_i = -\frac{\partial \Psi}{\partial c}\,e_i$   [solve_matrixfree_tan]
4. For $i = 1, \dots, n$ and $k = i, \dots, n$:
   - Compute $D^2_{i,k}J$ [func_d_d] and $D^2_{i,k}\Psi$ [stateres_d_d]
   - Compute $\dfrac{d^2 j}{dc_i\,dc_k} = D^2_{i,k}J - \Pi^t\, D^2_{i,k}\Psi$
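For a linear state equation and a quadratic functional, all second derivatives of $\Psi$ vanish and $\partial^2 J/\partial W^2 = I$, so the ToT formula collapses to $H_{ik} = \theta_i \cdot \theta_k$. The loop structure of the algorithm can then be sketched on such a toy (matrices are illustrative assumptions):

```python
import numpy as np

# Toy: Psi(c, W) = A W - B c (linear), J(c, W) = 0.5 * W @ W (quadratic).
# Then D2_ik Psi = 0 and D2_ik J = theta_i @ theta_k, so the algorithm
# reduces to n tangent solves followed by the dot products.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
B = np.array([[1.0, 0.0], [1.0, 1.0]])
n = B.shape[1]

thetas = []
for i in range(n):
    e_i = np.eye(n)[:, i]
    # tangent system: (dPsi/dW) theta_i = -(dPsi/dc) e_i, with dPsi/dc = -B
    thetas.append(np.linalg.solve(A, B @ e_i))

# assemble the (symmetric) Hessian from the D2_ik J terms
H = np.array([[thetas[i] @ thetas[k] for k in range(n)]
              for i in range(n)])
```

Here $j(c) = \tfrac12\|A^{-1}Bc\|^2$, whose exact Hessian $(A^{-1}B)^t(A^{-1}B)$ matches `H`.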
Second derivative: Tangent-on-Tangent mode

stateres_d(psi, psid, c, cd, w, wd)  →  stateres_d_d(psi, psid, psidd, c, cd0, cd, w, wd0, wd)
- Input variables: c = $c$, cd = $e_i$, w = $W$, wd = $\theta_i$, cd0 = $e_k$, wd0 = $\theta_k$
- Output variables: psi = $\Psi$, psid = $\dot\Psi = \left(\frac{\partial \Psi}{\partial c}\right) e_i + \left(\frac{\partial \Psi}{\partial W}\right) \theta_i$, and
$$\text{psidd} = \frac{\partial^2 \Psi}{\partial c\,\partial c}(e_i, e_k) + \frac{\partial^2 \Psi}{\partial W\,\partial c}(e_i, \theta_k) + \frac{\partial^2 \Psi}{\partial W\,\partial c}(e_k, \theta_i) + \frac{\partial^2 \Psi}{\partial W\,\partial W}(\theta_i, \theta_k) = D^2_{i,k}\Psi$$
Second derivative: Tangent-on-Tangent mode

func_d(j, jd, c, cd, w, wd)  →  func_d_d(j, jd, jdd, c, cd0, cd, w, wd0, wd)
- Input variables: c = $c$, cd = $e_i$, w = $W$, wd = $\theta_i$, cd0 = $e_k$, wd0 = $\theta_k$
- Output variables: j = $J$, jd = $\dot J = \left(\frac{\partial J}{\partial c}\right) e_i + \left(\frac{\partial J}{\partial W}\right) \theta_i$, and
$$\text{jdd} = \frac{\partial^2 J}{\partial c\,\partial c}(e_i, e_k) + \frac{\partial^2 J}{\partial W\,\partial c}(e_i, \theta_k) + \frac{\partial^2 J}{\partial W\,\partial c}(e_k, \theta_i) + \frac{\partial^2 J}{\partial W\,\partial W}(\theta_i, \theta_k) = D^2_{i,k}J$$
Second derivative: Tangent-on-Reverse

$$\frac{\partial}{\partial c_i}\left(\frac{dj}{dc}\right)^t = \left(\frac{d^2 j}{dc^2}\right) e_i = \frac{\partial}{\partial c}\left[\left(\frac{\partial J}{\partial c}\right)^t\right] e_i + \frac{\partial}{\partial W}\left[\left(\frac{\partial J}{\partial c}\right)^t\right] \theta_i - \frac{\partial}{\partial c}\left[\left(\frac{\partial \Psi}{\partial c}\right)^t \Pi\right] e_i - \frac{\partial}{\partial W}\left[\left(\frac{\partial \Psi}{\partial c}\right)^t \Pi\right] \theta_i - \left(\frac{\partial \Psi}{\partial c}\right)^t \lambda_i$$
with $\Pi$ solution of the adjoint system
$$\left(\frac{\partial \Psi}{\partial W}\right)^t \Pi = \left(\frac{\partial J}{\partial W}\right)^t$$
and $\theta_i$, $\lambda_i$ solutions of the systems
$$\frac{\partial \Psi}{\partial W}\,\theta_i = -\frac{\partial \Psi}{\partial c}\,e_i$$
$$\left(\frac{\partial \Psi}{\partial W}\right)^t \lambda_i = \frac{\partial}{\partial c}\left[\left(\frac{\partial J}{\partial W}\right)^t\right] e_i + \frac{\partial}{\partial W}\left[\left(\frac{\partial J}{\partial W}\right)^t\right] \theta_i - \frac{\partial}{\partial c}\left[\left(\frac{\partial \Psi}{\partial W}\right)^t \Pi\right] e_i - \frac{\partial}{\partial W}\left[\left(\frac{\partial \Psi}{\partial W}\right)^t \Pi\right] \theta_i$$
Second derivative: Tangent-on-Reverse (compact form)

$$\frac{\partial}{\partial c_i}\left(\frac{dj}{dc}\right)^t = \left(\frac{d^2 j}{dc^2}\right) e_i = \bar J_{c,i} - \bar\Psi_{c,i} - \left(\frac{\partial \Psi}{\partial c}\right)^t \lambda_i$$
with $\Pi$ solution of the adjoint system $\left(\frac{\partial \Psi}{\partial W}\right)^t \Pi = \left(\frac{\partial J}{\partial W}\right)^t$ and $\theta_i$, $\lambda_i$ solutions of the systems
$$\frac{\partial \Psi}{\partial W}\,\theta_i = -\frac{\partial \Psi}{\partial c}\,e_i, \qquad \left(\frac{\partial \Psi}{\partial W}\right)^t \lambda_i = \bar J_{W,i} - \bar\Psi_{W,i}$$
where $\bar J_{c,i}, \bar J_{W,i}$ (resp. $\bar\Psi_{c,i}, \bar\Psi_{W,i}$) gather the second-derivative terms of $J$ (resp. of $\Pi^t\Psi$) of the previous slide.
Second derivatives: basic ToR algorithm

1. Solve $\Psi(c, W) = 0$
2. Solve $\left(\frac{\partial \Psi}{\partial W}\right)^t \Pi = \left(\frac{\partial J}{\partial W}\right)^t$   [solve_matrixfree_adj]
3. For $i = 1, \dots, n$:
   - Initialize $e_i$
   - Solve $\frac{\partial \Psi}{\partial W}\,\theta_i = -\frac{\partial \Psi}{\partial c}\,e_i$   [solve_matrixfree_tan]
   - Compute the second-derivative terms $\bar J_{c,i}, \bar J_{W,i}$ [func_b_d] and $\bar\Psi_{c,i}, \bar\Psi_{W,i}$ [stateres_b_d]
   - Solve $\left(\frac{\partial \Psi}{\partial W}\right)^t \lambda_i = \bar J_{W,i} - \bar\Psi_{W,i}$   [solve_matrixfree_adj]
   - Compute $\frac{\partial}{\partial c_i}\left(\frac{dj}{dc}\right)^t = \bar J_{c,i} - \bar\Psi_{c,i} - \left(\frac{\partial \Psi}{\partial c}\right)^t \lambda_i$   [stateres_dc_b]
Full Hessian: ToT vs. ToR

Common part
- Adjoint state: $\left(\frac{\partial \Psi}{\partial W}\right)^t \Pi = \left(\frac{\partial J}{\partial W}\right)^t$
- $n$ linear systems to solve: $\frac{\partial \Psi}{\partial W}\,\frac{dW}{dc_i} = -\frac{\partial \Psi}{\partial c}\,e_i$

Specific part
We assume unit cost for a single residual evaluation $\Psi(c, W)$.
- ToT: computation of the $D^2_{ik}\Psi$ and $D^2_{ik}J$ costs $\dfrac{n(n+1)}{2}\,\alpha_T^2$  $(1 < \alpha_T < 4)$
- ToR: $n$ (adjoint) linear systems $\left(\frac{\partial \Psi}{\partial W}\right)^t \lambda_i = \bar J_{W,i} - \bar\Psi_{W,i}$ to solve, costing $n\,(n_{iter} + \alpha_T)\,\alpha_R$  $(1 < \alpha_R \le 5)$

Use ToT when $n < \dfrac{2\,\alpha_R\,(n_{iter} + \alpha_T)}{\alpha_T^2}$.
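The cost model and the resulting rule of thumb are easy to encode (unit = one residual evaluation; as on the slide, the criterion neglects a $+1$ term):

```python
def tot_cost(n, alpha_t):
    # ToT specific cost: n(n+1)/2 second-derivative evaluations at alpha_t^2
    return n * (n + 1) / 2 * alpha_t**2

def tor_cost(n, n_iter, alpha_t, alpha_r):
    # ToR specific cost: n adjoint linear solves of n_iter + alpha_t
    # residual-equivalents each, at reverse overhead alpha_r
    return n * (n_iter + alpha_t) * alpha_r

def prefer_tot(n, n_iter, alpha_t, alpha_r):
    # slide's criterion: use ToT when n < 2 alpha_r (n_iter + alpha_t) / alpha_t^2
    return n < 2 * alpha_r * (n_iter + alpha_t) / alpha_t**2
```

With the values used in the figure on the next slide ($n_{iter} = 300$, $\alpha_T = 2$, $\alpha_R = 4$), the crossover sits near $n \approx 604$: ToT wins for smaller $n$, ToR beyond.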
Cost for the full Hessian with different strategies

Figure: runtime cost of evaluating the full Hessian as a function of $n$, for the ToT, ToR and combined ToT + ToR approaches. We assumed $n_{iter,R} = n_{iter,T} = 300$ for the number of iterations needed to solve the linear systems (tangent and adjoint), and $\alpha_T = 2$, $\alpha_R = 4$ for the overheads associated with Tangent and Reverse differentiation.
4. Application to a 3D Euler software

Numerics:
- Upwind vertex-centered finite volume
- Implicit pseudo-time advancing with first-order Jacobian
- Fortran 77
Building the reduced quadratic model

Figure: drag as a function of Mach number and angle of attack ($\alpha = 2.0$, $M = 3$): nonlinear simulations, 1st-order Taylor model, 2nd-order Taylor model, and relative difference (%) between the nonlinear and 2nd-order Taylor surfaces.
3D Euler flow around SSBJ (1)
3D Euler flow around SSBJ (2): sensitivities of the drag coefficient

Figure: drag surfaces from the 1st-order and 2nd-order Taylor models ($\alpha = 3.0$) over Mach number and angle of attack.
3D Euler flow around SSBJ (2): sensitivities of the drag coefficient

Figure: drag surfaces from the 1st-order and 2nd-order Taylor models ($\alpha = 3.0$) at $M = 1.0$ (top) and $M = 1.1$ (bottom).
3D Euler flow around SSBJ (2): sensitivities of the drag coefficient

Figure: drag surfaces from the 1st-order and 2nd-order Taylor models ($\alpha = 3.0$) at $M = 1.2$ (top) and $M = 1.4$ (bottom).
3D Euler flow around SSBJ (2): sensitivities of the drag coefficient

Figure: drag surfaces from the 1st-order and 2nd-order Taylor models ($\alpha = 3.0$) at $M = 1.6$.
Comparing the reduced quadratic model with metamodels (*)

Metamodel               Points   Expectation   Rel. error   Variance    Rel. error
Radial Basis Function   8        6.855e-3      0.029%       1.562e-7    0.579%
Kriging                 8        6.853e-3      0.058%       1.551e-7    0.128%
AD: First-order                  6.830e-3      0.393%       1.547e-7    0.386%
AD: Second-order                 6.860e-3      0.043%       1.558e-7    0.321%

AD steps             Memory in Mb
Flow solver          first-order Jacobian + block-diag. precond. + work vectors: 86 + 10 + 35
First derivatives    ILU(1) precond. + GMRES(200) + differentiated vectors: 170 + 250 + 120
Second derivatives   ILU(1) precond. + GMRES(200) + differentiated vectors: 170 + 250 + 240

              AD: CPU time (s)   RBF or KRG: CPU time (s)
Flow solver   403                403 x 8
Gradient      255
Hessian       278
Total         936                3224

(*) M. Martinelli, R. Duvigneau, On the use of second-order derivative and metamodel-based Monte-Carlo for uncertainty estimation in aerodynamics, Computers and Fluids, Vol 37, No 6, 2010.
Concluding remarks

Synthesis: a non-black-box AD application
- Optimum use of modern iterative solvers.
- Avoids managing useless storage or recomputation in backward mode.
- First and second derivative architecture well identified.
- Validated by experiments.

Future work
- Very large unsteady state systems.
- Automated treatment of MPI.
- Functionals involved in robust optimisation, and their gradients.