Dynamically data-driven morphing of reduced order models and the prediction of transients


STOCHASTIC ANALYSIS AND NONLINEAR DYNAMICS
Dynamically data-driven morphing of reduced order models and the prediction of transients
Joint NSF/AFOSR EAGER on Dynamic Data Systems

Themis Sapsis, Massachusetts Institute of Technology, Department of Mechanical Engineering, ABS Career Development Assistant Professor
Yannis Kevrekidis, Princeton University, Department of Chemical Engineering, Smith Professor of Engineering

Complex systems with inherently transient dynamics: intermittent phenomena in CFD/GFD, extreme events in nonlinear waves, transient responses in networks.

Objective: dynamic, data-driven prediction and filtering.

Challenges:
- Very high dimensionality (both physical and intrinsic)
- Inherently time-dependent features (rare events, non-stationary statistics)
- Model error (neglected dynamics, unknown parameters)
- Data (observation errors, sparse data)

Approach: physics-constrained, data-driven modeling.

Optimally time-dependent modes

Goal: develop an approach that adaptively selects the modes associated with transient instabilities.

Setup: a dynamical system and its linearized dynamics around a trajectory. We introduce a minimization principle whose minimizers define the optimally time-dependent (OTD) modes.

H. Babaee & T. Sapsis, "A minimization principle for the description of modes associated with finite-time instabilities," Proceedings of the Royal Society A (2016), in press.
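The equations on this slide did not survive transcription. A standard statement of the OTD setup, following the notation commonly used with the cited Babaee & Sapsis (2016) paper (the symbols below are a reconstruction, not verbatim from the slide):

```latex
\begin{align*}
\dot{x} &= F(x,t) && \text{(dynamical system)} \\
\dot{v} &= \nabla F\big(x(t),t\big)\, v \equiv L(t)\, v && \text{(linearized dynamics around the trajectory)} \\
\min_{u_1,\dots,u_r}\; & \sum_{i=1}^{r} \big\| \dot{u}_i - L(t)\, u_i \big\|^2
\quad \text{subject to } \langle u_i, u_j \rangle = \delta_{ij}
&& \text{(minimization principle)}
\end{align*}
```

The minimizing orthonormal basis evolves, in matrix form with $U = [u_1,\dots,u_r]$, as $\dot{U} = L\,U - U\,(U^{\mathsf{T}} L\, U)$, up to a rotation within the subspace.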

Optimally time-dependent modes

Theorem 1. The minimization principle, restricted to basis elements satisfying the orthonormality constraint, is equivalent to a set of evolution equations for the modes.

Theorem 2. Let L be a steady and diagonalizable operator representing the linearization of an autonomous dynamical system. Then: (i) the OTD evolution equations have equilibrium states consisting of all r-dimensional subspaces spanned by r distinct eigenvectors of L; (ii) among all equilibrium states, only one is a stable solution of the evolution equations: the subspace spanned by the eigenvectors of L associated with the r eigenvalues of largest real part.

Optimally time-dependent modes: a simple 3D example with one OTD mode.
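A minimal sketch of such a 3D example (not the authors' code; the operator L below is a hypothetical diagonal choice). Per Theorem 2, a single OTD mode evolved under a steady operator should converge to the eigenvector with the largest real-part eigenvalue:

```python
import numpy as np

# Hypothetical steady 3x3 operator with distinct real eigenvalues 1, -0.5, -2;
# the dominant eigenvector is e1 = (1, 0, 0).
L = np.array([[1.0, 0.0, 0.0],
              [0.0, -0.5, 0.0],
              [0.0, 0.0, -2.0]])

def otd_rhs(u, L):
    """OTD evolution for a single mode (r = 1): du/dt = L u - <u, L u> u."""
    Lu = L @ u
    return Lu - (u @ Lu) * u

# Forward-Euler integration from a generic (non-eigenvector) unit vector.
u = np.ones(3) / np.sqrt(3.0)
dt = 0.01
for _ in range(5000):
    u = u + dt * otd_rhs(u, L)
    u = u / np.linalg.norm(u)  # re-normalize to stay on the unit sphere

print(np.round(np.abs(u), 3))  # converges to the dominant eigenvector e1
```

The re-normalization step only corrects the small drift introduced by the explicit time step; the continuous-time equation preserves the unit norm exactly.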

Physics-constrained, data-driven modeling: 3D unstable jet in cross flow.
[Figures: 1st OTD mode during the initial transient; 4 OTD modes in the statistical steady state; growth rate of the OTD modes.]

Dynamic data-driven reduced-order dynamics

The OTD basis adaptively captures the most important directions of phase space, but a simple Galerkin projection of the governing equations carries significant truncation error and model error. Can we utilize available data streams to stochastically reconstruct the reduced-order vector field within the reduced-order OTD subspace?

Approach: given the OTD basis u_i(t), i = 1, ..., r, up to the current time instant t, we project the available data points (z_j, ż_j), j = 1, ..., D. This gives the projected data points for the evolution vector within the OTD subspace:

y_j = ⟨z_j, u_i⟩ and ẏ_j = ⟨ż_j, u_i⟩.

We then use Gaussian process (GP) regression to reconstruct the reduced-order vector field ẏ ≈ f(y; u), which quantifies the mean vector field as well as the truncation and model error.
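A minimal sketch of this pipeline (not the authors' implementation): data from a hypothetical full-order linear system are projected onto a fixed orthonormal basis, and a zero-mean GP with an RBF kernel regresses the reduced-order vector field y → ẏ from the projected pairs. The system matrix A, basis U, and lengthscale are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical full-order data: states z_j and velocities zdot_j = A z_j,
# observed at D sample points.
A = np.array([[0.0, 1.0], [-1.0, -0.1]])
D = 60
Z = rng.uniform(-1, 1, size=(D, 2))
Zdot = Z @ A.T

# Reduced subspace: identity basis here for illustration (r = 2).
U = np.eye(2)
Y = Z @ U        # y_j   = <z_j, u_i>
Ydot = Zdot @ U  # ydot_j = <zdot_j, u_i>

def rbf(X1, X2, ell=0.5):
    """Squared-exponential (RBF) kernel matrix between two point sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

# GP posterior mean: f(y*) = k(y*, Y) (K + jitter I)^{-1} Ydot
K = rbf(Y, Y) + 1e-6 * np.eye(D)
alpha = np.linalg.solve(K, Ydot)

y_star = np.array([[0.3, -0.2]])
f_star = rbf(y_star, Y) @ alpha
print(f_star)        # GP estimate of the reduced vector field at y*
print(y_star @ A.T)  # true value, for comparison
```

The same posterior also yields a predictive variance, which is what quantifies the truncation and model error in the slide's formulation; only the mean is shown here for brevity.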

Dynamic data-driven reduced-order dynamics: demonstration over a fixed-in-time subspace.

Z. Y. Wan & T. Sapsis, "Reduced-space Gaussian Process Regression Forecast for Nonlinear Dynamical Systems," 2016 (submitted).

Summary

The dynamical equations and the system state feed the OTD equations. Incoming data are projected onto the OTD subspace, and the system state is updated only along the OTD directions. The reduced-order dynamics are machine-learned, yielding a machine-learned manifold and a probabilistic estimate of the current state. The DO subspace aligns with the most unstable directions in the neighborhood of the solution.

References:
H. Babaee & T. Sapsis, "A minimization principle for the description of modes associated with finite-time instabilities," Proceedings of the Royal Society A (2016), in press.
Z. Y. Wan & T. Sapsis, "Reduced-space Gaussian Process Regression Forecast for Nonlinear Dynamical Systems," 2016 (submitted).