
1 The Mean Square Error in Kalman Filtering Sensor Selection is Approximately Supermodular
Luiz F. O. Chamon, George J. Pappas, and Alejandro Ribeiro
CDC 2017, December 12th, 2017

2 Goal
Why does greedy sensor selection for Kalman filtering work when it shouldn't?

3 KFSS
Problem (KFSS): select up to $s$ system outputs to estimate the system's internal states. Why the MSE? Because the Kalman filter is precisely the minimum-MSE estimator.

    $\min_{S \subseteq O} \ \mathrm{MSE}(S) \quad \text{subject to} \quad |S| \le s$

NP-hard [Natarajan 95, Zhang 17, Ye 17]

4 Greedy KFSS
Definition: select sensors/outputs one at a time, choosing at each step the one that most improves estimation.

    function Greedy(q)
        G_0 = {}
        for j = 1, ..., q
            u = argmin_{v in O \ G_{j-1}} MSE(G_{j-1} ∪ {v})
            G_j = G_{j-1} ∪ {u}
        end
    end

5 Greedy KFSS (cont.)
Same algorithm as slide 4 (see the Python sketch below), with its selling points:
- Low complexity
- Sequential
- Near-optimal for supermodular objectives
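A minimal Python sketch of this greedy loop, illustrative rather than the authors' code; it assumes an `mse` callback that returns the estimation cost of a candidate selection:

    def greedy_kfss(mse, outputs, q):
        """Greedy KFSS from the slide: at each step, add the output whose
        inclusion most reduces the cost returned by `mse`."""
        G = set()
        for _ in range(q):
            # u = argmin over not-yet-selected v of MSE(G ∪ {v})
            best = min((v for v in outputs if v not in G),
                       key=lambda v: mse(G | {v}))
            G.add(best)
        return G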

6 KFSS (cont.)
The problem from slide 3 is NP-hard [Natarajan 95, Zhang 17, Ye 17]. Moreover:
- The estimation MSE is not supermodular [Tzoumas 16, Olshevsky 16, Singh 17, Zhang 17]
- Usual fix: optimize a supermodular surrogate instead (e.g., log det) [Joshi 09, Shamaiah 10, Tzoumas 16]

7 Greedy KFSS
[Figure: histograms of the relative suboptimality of greedy KFSS (count vs. relative suboptimality), one panel for smoothing and one for filtering.]

8 Greedy KFSS (cont.)
[Same figure as slide 7.] Takeaway: greedy KFSS is near-optimal.

9 Outline
- Kalman filtering sensor selection
- (Approximate) supermodularity
- Near-optimality of greedy KFSS

10 Kalman filtering
System model:

    $x_{k+1} = F x_k + w_k$,   $w_k \sim \mathcal{N}(0, \sigma_w^2 I)$
    $y_k = H x_k + v_k$,       $v_k \sim \mathcal{N}(0, \sigma_v^2 I)$
    $x_0 \sim \mathcal{N}(\bar{x}_0, \Pi_0)$

Problem (Filtering): estimate $x_k$ based on the outputs up to time $k$, i.e., $\hat{x}_k = E[x_k \mid \{y_j\}_{j \le k}]$.

Solution (Kalman filter): $\hat{x}_k = F \hat{x}_{k-1} + K_k [y_k - H F \hat{x}_{k-1}]$

11 Kalman filtering (with sensor selection)
Same system model as slide 10, but now only a subset of outputs is observed.

Problem (Filtering): estimate $x_k$ based on the outputs in $S \subseteq O$ up to time $k$, i.e., $\hat{x}_k = E[x_k \mid \{(y_j)_S\}_{j \le k}]$.

Solution (Kalman filter): $\hat{x}_k(S) = F \hat{x}_{k-1}(S) + K_k [(y_k)_S - H_S F \hat{x}_{k-1}(S)]$
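For concreteness, a minimal sketch of one predict/correct iteration restricted to a subset of outputs, under the isotropic-noise model above (NumPy; all names are illustrative, not the authors' code):

    import numpy as np

    def kf_step(x_hat, P, y, F, H, sigma_w2, sigma_v2, sensors):
        """One Kalman filter iteration using only the outputs in `sensors`."""
        H_S, y_S = H[sensors, :], y[sensors]
        n = len(x_hat)
        # Predict
        x_pred = F @ x_hat
        P_pred = F @ P @ F.T + sigma_w2 * np.eye(n)
        # Correct using the selected outputs only
        S_mat = H_S @ P_pred @ H_S.T + sigma_v2 * np.eye(len(sensors))
        K = P_pred @ H_S.T @ np.linalg.inv(S_mat)       # Kalman gain
        x_new = x_pred + K @ (y_S - H_S @ x_pred)
        P_new = (np.eye(n) - K @ H_S) @ P_pred
        return x_new, P_new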

12 KFSS
Problem (KF sensor selection): find $S \subseteq O$, $|S| \le s$, that minimizes the estimation MSE

    $\min_{|S| \le s} \ \sum_{j=0}^{m-1} \theta_j \, \mathrm{MSE}_{\ell+j}(S)$

- Myopic sensor selection: $m = 1$
- Final estimation MSE: $\theta_j = 0$ for $j < m-1$ and $\theta_{m-1} = 1$
- Exponentially weighted error: $\theta_j = \rho^{m-1-j}$, $\rho < 1$

13 KFSS (cont.)
Problem (KF sensor selection), equivalently: find $S \subseteq O$, $|S| \le s$, minimizing

    $\min_{|S| \le s} \ \sum_{j=0}^{m-1} \theta_j \, \mathrm{Tr}[P_{\ell+j}(S)]$

where the error covariance follows the information-form recursion (sketched in code below)

    $P_k(S) = \big( P_{k|k-1}^{-1} + \sigma_v^{-2} \sum_{i \in S} h_i h_i^T \big)^{-1}$,
    $P_{k|k-1} = F P_{k-1}(S) F^T + \sigma_w^2 I$,

and $h_i^T$ is the row of $H$ corresponding to the $i$-th sensor.
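Combining this recursion with the weighted objective gives the set function that the greedy loop evaluates. A sketch under the same assumptions as the filter step above (names are illustrative):

    import numpy as np

    def kfss_objective(sensors, F, H, sigma_w2, sigma_v2, P0, ell, theta):
        """Weighted KFSS cost sum_j theta[j] * Tr[P_{ell+j}(S)], computed with
        the information-form recursion from this slide."""
        n = F.shape[0]
        P, cost, m = P0.copy(), 0.0, len(theta)
        for k in range(1, ell + m):
            P_pred = F @ P @ F.T + sigma_w2 * np.eye(n)   # P_{k|k-1}
            info = np.linalg.inv(P_pred)
            for i in sensors:
                h = H[i, :].reshape(-1, 1)                # i-th output row
                info += (h @ h.T) / sigma_v2              # rank-one information update
            P = np.linalg.inv(info)                       # P_k(S)
            if k >= ell:
                cost += theta[k - ell] * np.trace(P)
        return cost

Plugging something like `lambda S: kfss_objective(sorted(S), ...)` into the greedy sketch from slide 5 reproduces greedy KFSS end to end.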

14 Outline
- Kalman filtering sensor selection
- (Approximate) supermodularity
- Near-optimality of greedy KFSS

15 Supermodularity
Definition (Supermodularity): for $A \subseteq B \subseteq O$ and $u \in O \setminus B$,

    $f(A) - f(A \cup \{u\}) \ge f(B) - f(B \cup \{u\})$

i.e., diminishing returns.

16 Greedy supermodular minimization
Theorem ([NWF 78]): let $S^\star$ be the optimal solution of $\min_{|S| \le s} f(S)$ and $G$ its greedy solution. If $f$ is (i) monotone decreasing and (ii) supermodular, then

    $\dfrac{f(G) - f(S^\star)}{f(\emptyset) - f(S^\star)} \le e^{-1}$

17 Greedy supermodular minimization (cont.)
(Restates the theorem from slide 16: for monotone decreasing, supermodular $f$, the greedy solution satisfies $[f(G) - f(S^\star)] / [f(\emptyset) - f(S^\star)] \le e^{-1}$.)

18 α-supermodularity
Definition (Supermodularity), rewritten in terms of marginal gains: for $A \subseteq B \subseteq O$ and $u \in O \setminus B$,

    $f(A) - f(A \cup \{u\}) \ge f(B) - f(B \cup \{u\})$

19 α-supermodularity (cont.)
Definition (α-supermodularity): for $A \subseteq B \subseteq O$, $u \in O \setminus B$, and $\alpha \ge 0$,

    $f(A) - f(A \cup \{u\}) \ge \alpha \big[ f(B) - f(B \cup \{u\}) \big]$

- If $\alpha \ge 1$: $f$ is supermodular
- If $\alpha < 1$: $f$ is approximately supermodular

20 Greedy α-supermodular minimization
Theorem ([Chamon-Ribeiro 16]): let $S^\star$ be the solution of $\min_{|S| \le s} f(S)$ and $G_q$ the $q$-th iterate of the greedy solution. If $f$ is (i) monotone decreasing and (ii) α-supermodular, then

    $\dfrac{f(G_q) - f(S^\star)}{f(\emptyset) - f(S^\star)} \le e^{-\alpha q / s}$

21 Greedy α-supermodular minimization (cont.)
Consequences of the bound $e^{-\alpha q / s}$:
- For $q = s$ and $\alpha = 1$, we recover the classical $e^{-1}$ result
- If $\alpha < 1$, the $e^{-1}$ guarantee is recovered for $q = \alpha^{-1} s$
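To make the guarantee concrete, take illustrative numbers $\alpha = 0.8$ and $q = s$:

    $\dfrac{f(G_s) - f(S^\star)}{f(\emptyset) - f(S^\star)} \le e^{-0.8} \approx 0.45$

so greedy closes at least 55% of the gap between the empty selection and the optimum; selecting $q = \alpha^{-1} s = 1.25\, s$ sensors instead restores the classical $e^{-1} \approx 0.37$ certificate.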

22 α for KFSS
What is $\alpha$ for KFSS? A combinatorial problem:

    $\alpha = \min_{\substack{A \subseteq B \subseteq O \\ u \in O \setminus B}} \dfrac{\mathrm{MSE}(A) - \mathrm{MSE}(A \cup \{u\})}{\mathrm{MSE}(B) - \mathrm{MSE}(B \cup \{u\})}$
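Since the definition quantifies over all nested pairs of subsets, $\alpha$ can be checked exhaustively only for tiny ground sets. A brute-force sketch, assuming integer-labeled outputs and a set-function callback `f` (e.g., the KFSS objective above):

    import itertools

    def alpha_bruteforce(f, ground):
        """Exhaustive alpha: the minimum ratio of marginal gains over all
        nested pairs A ⊆ B ⊆ ground and u outside B. Exponential cost."""
        elems = sorted(ground)                # assumes orderable labels (e.g., ints)
        alpha = float("inf")
        for r_b in range(len(elems)):         # |B| < |ground|, so some u remains
            for B in map(frozenset, itertools.combinations(elems, r_b)):
                for r_a in range(len(B) + 1):
                    for A in map(frozenset, itertools.combinations(sorted(B), r_a)):
                        for u in set(elems) - B:
                            den = f(B) - f(B | {u})
                            if den > 1e-12:   # skip vanishing denominators
                                alpha = min(alpha, (f(A) - f(A | {u})) / den)
        return alpha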

23 α for KFSS (cont.)
Theorem ([Chamon-Pappas-Ribeiro 17]): the objective of KFSS is α-supermodular with

    $\alpha \ge \min_{\ell \le k \le \ell + m - 1} \dfrac{\lambda_{\min}[P_k(O)]}{\lambda_{\max}[P_{k|k-1}]}$

where $P_k(O) = \big( P_{k|k-1}^{-1} + \sigma_v^{-2} H^T H \big)^{-1}$ and $P_{k|k-1} = F P_{k-1} F^T + \sigma_w^2 I$.

Takeaway: $\sigma_v^2 \gg \sigma_w^2$ and small $\kappa(F)$ give $\alpha \approx 1$.
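Unlike the combinatorial definition, this spectral bound is cheap to evaluate along the covariance recursion for the full output set $O$. A sketch of that computation (names are illustrative):

    import numpy as np

    def alpha_lower_bound(F, H, sigma_w2, sigma_v2, P0, ell, m):
        """Evaluate min over ell <= k <= ell+m-1 of
        lambda_min[P_k(O)] / lambda_max[P_{k|k-1}], per the theorem above."""
        n = F.shape[0]
        P, bound = P0.copy(), float("inf")
        for k in range(1, ell + m):
            P_pred = F @ P @ F.T + sigma_w2 * np.eye(n)          # P_{k|k-1}
            P = np.linalg.inv(np.linalg.inv(P_pred)
                              + (H.T @ H) / sigma_v2)            # P_k(O)
            if k >= ell:
                ratio = (np.linalg.eigvalsh(P).min()
                         / np.linalg.eigvalsh(P_pred).max())
                bound = min(bound, ratio)
        return bound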

24 α for KFSS: numerical example
[Figure: α lower bound (in dB) for a system with n = 100 states and H = I.]

25 Outline
- Kalman filtering sensor selection
- (Approximate) supermodularity
- Near-optimality of greedy KFSS

26 α for KFSS: numerical example (cont.)
[Figure: suboptimality certificate (in dB) for n = 100 states and H = I.]

27 Conclusion
- Why does greedy KFSS work so well?
- The MSE in KFSS is not supermodular, but it is almost supermodular
- Greedy KFSS is efficient and has guaranteed near-optimal performance

28 The Mean Square Error in Kalman Filtering Sensor Selection is Approximately Supermodular
Luiz F. O. Chamon, George J. Pappas, and Alejandro Ribeiro
More details: luizf
