Identification of Finite Lattice Dynamical Systems


Identification of Finite Lattice Dynamical Systems Junior Barrera BIOINFO-USP University of São Paulo, Brazil

Outline: Introduction; Lattice operator representation; Lattice operator design; Lattice dynamical systems (LDS); A simulator for chain dynamical systems; LDS identification; System identification examples; Conclusion

Introduction

Dynamical Systems [diagram: trajectories driven by an input u, from an initial state to final states x1 = y1 and x2 = y2]

Sequential Switching Circuits

Mathematical Morphology studies operators between complete lattices, which include switching functions. Lattice operators are decomposed in terms of simple morphological operators: erosion, dilation, anti-erosion, anti-dilation. Any lattice operator can be decomposed into a canonical morphological representation.

Lattice Dynamical Systems We present the notion of Lattice Dynamical System (LDS), give a representation for LDSs based on canonical morphological representations, and formalize the problem of statistical identification of LDSs.

Operator Representation

ψ : Fun[W, L] → L [table: an example operator mapping functions on a window W to gray levels L = {-2, -1, 0, 1, 2}]

Intervals Let a, b ∈ Fun[W, L]; a ≤ b iff a(x) ≤ b(x) for all x ∈ W. Interval [a, b] = {u ∈ Fun[W, L] : a ≤ u ≤ b}. [diagram: a ≤ b on a window with |W| = 2]

Binary Sup-generating Sup-generating operator: λ_{a,b}(u) = 1 ⇔ u ∈ [a, b]. [diagram: a binary pattern and the interval [a, b] defining λ_{a,b}]
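A minimal sketch of the interval check behind the sup-generating operator, assuming patterns on W are represented as NumPy arrays (the names are illustrative, not from the talk):

```python
import numpy as np

def leq(a, b):
    """Pointwise order on Fun[W, L]: a <= b iff a(x) <= b(x) for every x in W."""
    return bool(np.all(a <= b))

def sup_generating(a, b):
    """lambda_{a,b}(u) = 1 iff u lies in the interval [a, b]."""
    return lambda u: 1 if leq(a, u) and leq(u, b) else 0

# Window with |W| = 2 and gray levels L = {-2, ..., 2}
lam = sup_generating(np.array([-1, 0]), np.array([1, 2]))
print(lam(np.array([0, 1])), lam(np.array([2, 1])))  # 1 0
```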

Kernel Kernel of ψ at y: K(ψ)(y) = {u ∈ Fun[W, L] : y ≤ ψ(u)}. [table: the kernels K(ψ)(-2), K(ψ)(-1), K(ψ)(0), K(ψ)(1), K(ψ)(2) of the example operator]
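For tiny windows the kernel can be computed by brute force; a sketch, assuming ψ is given as a Python callable:

```python
import itertools
import numpy as np

def kernel(psi, L_values, n, y):
    """K(psi)(y) = {u in Fun[W, L] : y <= psi(u)}, enumerating all |L|^n patterns."""
    return [u for u in itertools.product(L_values, repeat=n)
            if y <= psi(np.array(u))]

psi = lambda u: int(u.min())               # toy operator: pointwise minimum
print(kernel(psi, [0, 1], 2, 1))           # [(1, 1)]
```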

Basis Basis of ψ at y: B(ψ)(y) is the set of maximal intervals contained in K(ψ)(y). [diagram: the kernels K(ψ)(y) and bases B(ψ)(y) for y = -2, ..., 2]
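The basis can likewise be found by brute force in toy cases: enumerate the intervals contained in the kernel and keep the maximal ones. A sketch, with the kernel given as a set of tuples:

```python
import itertools

def basis(K, L_values):
    """B(psi)(y): maximal intervals [a, b] whose whole box {u : a <= u <= b}
    lies inside the kernel K (a set of tuples). Brute force, tiny cases only."""
    leq = lambda a, b: all(x <= y for x, y in zip(a, b))
    box = lambda a, b: itertools.product(
        *[[v for v in L_values if lo <= v <= hi] for lo, hi in zip(a, b)])
    inside = [(a, b) for a in K for b in K
              if leq(a, b) and all(u in K for u in box(a, b))]
    return [(a, b) for (a, b) in inside
            if not any((c, d) != (a, b) and leq(c, a) and leq(b, d)
                       for (c, d) in inside)]

K = {(0, 1), (1, 0), (1, 1)}   # kernel of max at y = 1, binary 2-point window
print(basis(K, [0, 1]))        # [(0,1),(1,1)] and [(1,0),(1,1)], in some order
```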

Canonical Representation ψ(u) = ∨{ y ∈ M : ∨{ λ_{a,b}(u) : [a,b] ∈ B(ψ)(y) } = 1 }. [diagram: the kernels K(ψ)(y); example value ψ(-1,-1) = 1]
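Applying the canonical sup-representation then amounts to scanning the basis. A sketch, where `bases` maps each output level y to its list of intervals (illustrative names, assuming a chain output lattice so that sup is max):

```python
def apply_canonical(bases, u):
    """psi(u) = sup{ y : lambda_{a,b}(u) = 1 for some [a, b] in B(psi)(y) }."""
    leq = lambda a, b: all(p <= q for p, q in zip(a, b))
    return max(y for y, intervals in bases.items()
               if any(leq(a, u) and leq(u, b) for a, b in intervals))

# Toy basis encoding psi = max over a binary 2-point window:
bases = {0: [((0, 0), (1, 1))], 1: [((0, 1), (1, 1)), ((1, 0), (1, 1))]}
print(apply_canonical(bases, (0, 1)))      # 1
print(apply_canonical(bases, (0, 0)))      # 0
```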

Operator Design

Optimization problem Design goal is to find a function ψopt : Fun[W, L] → L with minimum error. Error (expected loss) of a function ψ: Er(ψ) = E[l(ψ(X), Y)], where X is a random function, Y is a random variable, and l : L × L → R+ is the loss function.
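Since the expectation is over an unknown distribution, in practice one works with a plug-in estimate from samples; a minimal sketch with 0-1 loss as the default (names are illustrative):

```python
import numpy as np

def empirical_error(psi, X, Y, loss=lambda yhat, y: float(yhat != y)):
    """Plug-in estimate of Er(psi) = E[l(psi(X), Y)] from realizations (X_i, Y_i)."""
    return float(np.mean([loss(psi(x), y) for x, y in zip(X, Y)]))
```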

Estimation problem The distribution P(X, Y) is unknown; P(X, Y), Er(ψ) and ψopt should be estimated from realizations of X and Y. For m > m(ε, δ) examples, Pr(Er(ψ) - Er(ψopt) < ε) > 1 - δ, for ε, δ ∈ (0, 1).
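One standard estimator of ψopt under 0-1 loss is the empirical conditional mode of Y given each observed window pattern; a sketch of that idea (an illustration, not necessarily the talk's exact procedure):

```python
from collections import Counter, defaultdict

def design_operator(samples):
    """For each observed window pattern u, output the most frequent label y:
    the empirical conditional mode, which minimizes the empirical 0-1 loss."""
    counts = defaultdict(Counter)
    for u, y in samples:
        counts[tuple(u)][y] += 1
    return {u: c.most_common(1)[0][0] for u, c in counts.items()}

table = design_operator([((0, 1), 1), ((0, 1), 1), ((0, 1), 0), ((1, 0), 0)])
print(table)                               # {(0, 1): 1, (1, 0): 0}
```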

The constrained estimation problem [plot: error versus sample size for the constrained space Ψ and the unconstrained optimum ψopt]

Stack filter: spatially translation invariant, spatially locally defined, increasing, commutes with thresholding.
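"Commutes with thresholding" is the defining stack-filter property: the gray-level output equals the sum of the filter's action on the binary cross-sections. A small demonstration with the median filter, a classic stack filter:

```python
import numpy as np

u = np.array([3, 1, 2, 5, 4])              # gray-level signal, L = {1, ..., 5}
stack = [(u >= t).astype(int) for t in range(1, 6)]  # threshold decomposition

# median of the signal equals the sum of medians of the binary cross-sections
print(np.median(u), sum(np.median(s) for s in stack))   # 3.0 3.0
```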

Noise elimination [figures: training images; test images after iteration 1 and iteration 5]

Apertures: spatially translation invariant, spatially locally defined, range translation invariant, range locally defined.

Deblurring training images

Resolution Enhancement

[diagram: four operators Ψ0, Ψ1, Ψ2, Ψ3, one for each subpixel phase, e.g. (0,0) and (0,1)]

[figures: Original, Aperture (3x3x21x51), Linear, Bilinear]

Zoom [figures: Original, Aperture (3x3x21x51), Linear, Bilinear]

Independent Constraints A constraint is a restriction of the operator space: K(ψopt) ∈ Q ⊆ P(P(W)). Independent constraint: let A, B ⊆ P(W) with A ⊆ B; Q is the family of kernels of operators ψ with hψ(x) = 1 for all x ∈ A and hψ(x) = 0 for all x ∉ B. [diagram: patterns in A map to {1}, patterns outside B map to {0}, patterns in B - A are free in {0, 1}]

Independent Constraints Proposition: if Q is an independent constraint, then there exists a pair of operators (α, β) such that, for any ψ ∈ ΨW, K(ψ) ∈ Q ⇔ α ≤ ψ ≤ β, where K(α) = A and K(β) = B. Every independent constraint is characterized by two operators α and β; the pair (α, β) is called an envelope. [diagram: hα equals 1 on A and 0 elsewhere; hβ equals 1 on B and 0 elsewhere]
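Once an envelope (α, β) is fixed, a designed operator can be forced to respect the constraint by clipping its output between the two; this is one simple way to enforce α ≤ ψ ≤ β (a sketch, with α, β, ψ as callables):

```python
def constrain_to_envelope(psi, alpha, beta):
    """Return the operator clipped into the envelope: alpha <= result <= beta,
    assuming alpha(u) <= beta(u) for every pattern u."""
    return lambda u: min(max(psi(u), alpha(u)), beta(u))
```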

Noise Edge Detection [pipeline: ground image → noise addition → noisy image → restoration → filtered image → edge detection → detected edges; alternatively, direct edge detection on the noisy image → detected edges]

Restoration a) Machine design of the restoration: ψpac designed by examples. b) Human-machine design of the restoration: ψcon = (ψpac ∧ β) ∨ α, with α = δ_{B⊕B} ε_{B⊕B} δ_B ε_B and β = ε_{B⊕B} δ_{B⊕B} ε_B δ_B. α and β are alternating sequential filters with P[α(S) ≤ I ≤ β(S)] ≈ 1; B is the 3x3 square. [bar chart, Error (%): machine design 0.28%, human-machine design 0.13%]
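A generic alternating sequential filter built from erosions and dilations, in the spirit of the α and β above (a sketch with SciPy, not the exact filters from the slide):

```python
import numpy as np
from scipy.ndimage import binary_dilation, binary_erosion

B = np.ones((3, 3), dtype=bool)            # the 3x3 square structuring element

def opening(img, se):
    return binary_dilation(binary_erosion(img, se), se)

def closing(img, se):
    return binary_erosion(binary_dilation(img, se), se)

def asf(img, se=B):
    """One stage of an alternating sequential filter: closing then opening."""
    return opening(closing(img, se), se)
```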

Noise Edge Detection a) Machine design over noisy images: ζpac designed by examples from noisy images. b) Human design after restoration: ζ = id - εB; B is the 3x3 square. c) Machine design after restoration: ζpac designed by examples from restored images. [bar chart, Error (%): machine over noisy images 0.65%, human after restoration 0.27%, machine after restoration 0.24%]
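The human-designed detector ζ = id - εB is the internal morphological gradient; a sketch for binary (Boolean) images:

```python
import numpy as np
from scipy.ndimage import binary_erosion

def internal_gradient(img, se=np.ones((3, 3), dtype=bool)):
    """zeta = id - erosion_B: pixels set in the image but not in its erosion,
    i.e. the inner boundary of each object. img is a Boolean array."""
    return img & ~binary_erosion(img, se)
```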

Noise Edge Detection [figures: machine design over noisy images, error 0.65%; human design after restoration, error 0.27%; machine design after restoration, error 0.24%]

Multiresolution Constraint [diagram: 8 variables x1, ..., x8 are reduced to 4 variables z1, ..., z4 and then to 2 variables w1, w2; the pattern tables grow with resolution: 2 variables give 2^2 = 4 patterns, 4 variables give 2^4 = 16, 8 variables give 2^8 = 256]

Multiresolution Constraint D1 = P(W1), D0 = P(W0). zi = pi(xi,1, ..., xi,9), z = p(x), p = (p1, ..., p9). Let φ : D1 → {0, 1}; it defines the operator Ψφ on D0 by Ψφ(x) = φ(p(x)). The operator Ψφ is constrained by resolution to D1. Equivalence classes on D0 are defined by p(x) = p(y).
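A sketch of a resolution-constrained operator Ψφ(x) = φ(p(x)); the coarsening p used here (OR-pooling pairs of binary variables) and the decision φ are only illustrations, not the talk's pi:

```python
def make_pyramid_operator(phi, p):
    """Operator constrained by resolution: it can only depend on x through
    the coarser pattern z = p(x)."""
    return lambda x: phi(p(x))

p = lambda x: tuple(a | b for a, b in zip(x[::2], x[1::2]))  # 8 -> 4 variables
phi = lambda z: int(sum(z) >= 2)                             # decision on z
psi = make_pyramid_operator(phi, p)
print(psi((1, 0, 0, 0, 1, 1, 0, 0)))       # z = (1, 0, 1, 0) -> 1
```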

Multiresolution Constraint D2 = P(W2), D1 = P(W1), D0 = P(W0); x ∈ D0, z ∈ D1, v ∈ D2. zi = pi(xi,1, ..., xi,9), z = p(x), p = (p1, ..., p81). vi = wi(xi,1, ..., xi,81), v = w(x), w = (w1, ..., w9). The equivalence classes defined by p may be different from the ones defined by w.

Multiresolution Noise [figures: image, noise, image + noise]

Restoration [figures: 3x3 window vs. pyramid restoration; persisting noise]

Dynamical Systems

Finite Lattice Dynamical System

Representation The component functions have canonical morphological representations

System: input-output

Filter For example, processing of motion images.

Input-free systems

Causal systems

Time translation invariant systems

Causal, time translation invariant

A simulator for chain dynamical systems

Simulator Architecture

Functions Representation

System Description

Cell cycle simulation [diagram: division steps, ingredients, control, external signal; p53 and receptors; arrows marked excitation, inhibition, and excitation or inhibition]

A model for system identification

Model The model is a vector of functions S(Φ, Ψ); the optimal system is Sopt(Φ, Ψ); each component function of S has a basis.

System Error

Stationary Conditions

Component Error

Additive Loss Function

Independence Condition Under an additive loss function, optimizing the system error is equivalent to optimizing the error of each system component. The problem of system identification is reduced to a family of problems of lattice operator design.
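Under the additive loss, each component function of the LDS can therefore be designed independently from observed state transitions; a minimal sketch using the per-component conditional mode (an illustration, not the talk's exact estimator):

```python
from collections import Counter, defaultdict

def identify_lds(transitions, n):
    """Design each of the n component functions separately: for component i,
    tabulate the most frequent next value x_i(t+1) for each observed state."""
    tables = [defaultdict(Counter) for _ in range(n)]
    for state, next_state in transitions:
        for i in range(n):
            tables[i][tuple(state)][next_state[i]] += 1
    return [{s: c.most_common(1)[0][0] for s, c in t.items()} for t in tables]

# Transitions of a 2-component Boolean system: (state, next_state) pairs
data = [((0, 1), (1, 1)), ((0, 1), (1, 0)), ((0, 1), (1, 1)), ((1, 1), (0, 1))]
print(identify_lds(data, 2))   # [{(0,1): 1, (1,1): 0}, {(0,1): 1, (1,1): 1}]
```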

Identification of Dynamical Systems

A Boolean System

System simulation

System identification: system error

System identification: transition error

[figures: the ideal system and estimates from 100, 500, and 1500 training examples]

Motion Segmentation

Mask Predictor result

Filtering Color Composition

Watershed Color Composition

Motion Segmentation

Motion Segmentation

Conclusion

We presented the notion of Lattice Dynamical System, proposed a model for LDS identification, and showed that, under the additive loss condition, system identification reduces to a family of problems of lattice operator design. Some examples were presented. This perspective unifies theories such as switching theory, discrete automatic control, and reinforcement learning.