An Algebraic Generalization for Graph and Tensor-Based Neural Networks


An Algebraic Generalization for Graph and Tensor-Based Neural Networks
CIBCB 2017, Manchester, U.K.
Ethan C. Jackson¹, James A. Hughes¹, Mark Daley¹, and Michael Winter²
August 24, 2017
¹ The University of Western Ontario, London, Canada
² Brock University, St. Catharines, Canada

Motivation

Neural Network Representations

There is no current standard language or mathematical framework for neural networks, and hence no standard representation.

In computational neuroscience:
- CSA: Connection Set Algebra [5]
- NeuroML: XML-based model description [6]
- Custom representations

In computer science:
- Tensor-based: TensorFlow [1], Theano [12], Keras [2]
- Graph-based: NEAT, HyperNEAT [7]
- Custom representations

Evolutionary Computation

NEAT:
- Architecture and parameters evolved simultaneously
- Connections are encoded point-to-point

Tensor frameworks:
- Architectures typically hand-designed
- High-level interfaces increasingly available
- Evolutionary algorithms beginning to be used [4]

Connectomics

Connectomics is concerned with studying brain function in relation to network organization and dynamics; see work by O. Sporns [10], [11].
- Self-similarity and fractal-like patterns
- At multiple levels of organization

The human genome does not encode each neural connection point-to-point: it does not encode an adjacency list.

Goals

Find a common mathematical framework for existing tools that provides:
- A universal network description language
- Modular, generative operations
- Symbolic reasoning
- Model portability
- Self-similar or fractal-like representation

Example: Feed-Forward Network Architecture

[Figure: a feed-forward neural network architecture.]

Example: Graph and Tensor

T_1 = R_1 \iota_{F_1}^{F_1 \oplus F_2} + R_2 \kappa_{F_2}^{F_1 \oplus F_2}
T_2 = \pi_{F_1}^{F_1 \oplus F_2} R_3 \iota_{A_1}^{A_1 \oplus B_1} + \rho_{F_2}^{F_1 \oplus F_2} R_5 \kappa_{B_1}^{A_1 \oplus B_1}
T_3 = \pi_{A_1}^{A_1 \oplus B_1} R_4 \iota_{A_2}^{A_2 \oplus B_2} + \rho_{B_1}^{A_1 \oplus B_1} R_6 \kappa_{B_2}^{A_2 \oplus B_2}
T_4 = \pi_{A_2}^{A_2 \oplus B_2} R_7 + \rho_{B_2}^{A_2 \oplus B_2} R_8
T_5 = R_9
\mathrm{eval}(R_1, \ldots, R_9) : \mathrm{In} \to \mathrm{Out} := T_1 T_2 T_3 T_4 T_5

[Figures: the example network as a HyperNEAT genome, and the example network as a tensor expression.]
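
As an illustration only (not from the paper): if each T_i is realized as a real-valued matrix mapping one layer's index set to the next, then eval is simply the chain of matrix products. The layer sizes below are hypothetical.

```python
import numpy as np

# A minimal sketch with made-up dimensions: |In| = |Out| = 1,
# |F1| = |F2| = |A1| = |A2| = |B1| = |B2| = 2, |C| = 2.
rng = np.random.default_rng(0)

T1 = rng.random((1, 4))  # In          -> F1 (+) F2
T2 = rng.random((4, 4))  # F1 (+) F2   -> A1 (+) B1
T3 = rng.random((4, 4))  # A1 (+) B1   -> A2 (+) B2
T4 = rng.random((4, 2))  # A2 (+) B2   -> C
T5 = rng.random((2, 1))  # C           -> Out

# Composition of linear maps = matrix product.
evaluated = T1 @ T2 @ T3 @ T4 @ T5   # In -> Out
print(evaluated.shape)               # (1, 1)
```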

Example: Network, Module, and Substitution

[Figure: the substitution of network connections by a repeated module.]

Example: Matrix

         In   F1   F2   A1   A2   B1   B2   C    Out
    In        R1   R2
    F1                  R3
    F2                            R5
    A1                       R4
    A2                                      R7
    B1                                 R6
    B2                                      R8
    C                                            R9
    Out

A relational matrix describing a high-level neural network architecture.
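
For illustration, here is a hedged Python sketch (my encoding, not the authors' code) that records the matrix's connections as a {0, 1} adjacency matrix and confirms that In reaches Out via the transitive closure:

```python
import numpy as np

# Mark 1 wherever some R_i appears in the relational matrix above.
labels = ["In", "F1", "F2", "A1", "A2", "B1", "B2", "C", "Out"]
ix = {name: i for i, name in enumerate(labels)}

edges = [("In", "F1"), ("In", "F2"), ("F1", "A1"), ("F2", "B1"),
         ("A1", "A2"), ("B1", "B2"), ("A2", "C"), ("B2", "C"), ("C", "Out")]

M = np.zeros((9, 9), dtype=int)
for a, b in edges:
    M[ix[a], ix[b]] = 1

# Transitive closure over the Boolean semiring: iterate R + R;M, clamped to {0,1}.
closure = M.copy()
for _ in range(len(labels)):
    closure = np.minimum(closure + closure @ M, 1)

print(bool(closure[ix["In"], ix["Out"]]))  # True: In is connected to Out
```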

Background

Connection Set Algebra

CSA is a mathematical framework for compactly describing neural architectures.
- Masks: Boolean relations
- Value sets: real-valued vectors
- Elementary connection sets: elementary relations
- "Pseudo"-algebraic operations: no formal properties
- Functions for establishing connectivity: incomplete set

Relations

Classical or Boolean-valued relations: R ⊆ A × B, i.e. R : A × B → {0, 1}. Often interpreted as Boolean-valued matrices, e.g.:

         B1  B2  ...  Bn
    A1    1   0  ...   0
    A2    1   1  ...   1
    ...
    An    1   0  ...   1

Relation Algebraic Operations

A relation algebra is an extended, residuated Boolean algebra. Elementary relation algebraic operations:
- Union*
- Intersection*
- Composition*
- Complement*
- Difference*
- Converse

*Corresponds to a set-theoretic operation. Converse reverses the order of all pairs.
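
A minimal Python sketch of these operations on Boolean matrices may help; R and S here are hypothetical relations on a common two-element set, so every operation typechecks:

```python
import numpy as np

# Relations as Boolean matrices (my own encoding, not the authors' code).
R = np.array([[1, 0], [1, 1]], dtype=bool)  # R : A -> A
S = np.array([[0, 1], [1, 0]], dtype=bool)  # S : A -> A

union        = R | S
intersection = R & S
complement   = ~R
difference   = R & ~S
converse     = R.T                          # reverses the order of all pairs

# Composition R;S: (R;S)[a,c] holds iff some b has R[a,b] and S[b,c].
composition = (R.astype(int) @ S.astype(int)) > 0

print(composition.astype(int))  # [[0 1]
                                #  [1 1]]
```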

Semiring Matrices

A semiring is a very general algebraic structure, and its elements can be used to model connectivity between neurons. The reals with the standard addition and multiplication operations, (R, +, ·, 0, 1), form a semiring.

A matrix over the semiring R:

    [ 1.2  8.2  0   ]
    [ 0.1  9.9  1.2 ]
    [      ...      ]
    [ 3.4  0.4  1   ]
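
To make the generality concrete, here is a hedged sketch of a matrix product parameterized by semiring operations; the function name and examples are mine, not the paper's. The same routine works over (R, +, ·) for weights and over the Boolean semiring ({0, 1}, OR, AND) for pure connectivity.

```python
import numpy as np

def semiring_matmul(A, B, add, mul, zero):
    """Matrix product where + and * are supplied by the caller."""
    n, k = A.shape
    k2, m = B.shape
    assert k == k2
    C = np.full((n, m), zero, dtype=object)
    for i in range(n):
        for j in range(m):
            acc = zero
            for t in range(k):
                acc = add(acc, mul(A[i, t], B[t, j]))
            C[i, j] = acc
    return C

# Real-valued weights over (R, +, *).
W = np.array([[1.2, 8.2], [0.1, 9.9]], dtype=object)
print(semiring_matmul(W, W, lambda a, b: a + b, lambda a, b: a * b, 0.0))

# Boolean connectivity over ({0,1}, OR, AND): OR = max, AND = min.
Bmat = np.array([[1, 0], [1, 1]], dtype=object)
print(semiring_matmul(Bmat, Bmat, max, min, 0))
```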

A Unified Approach

A relation algebra is embedded in the matrix algebra over a large class of semirings that includes the field R and functions over R.
- Requires only a minimal extension: a flattening operation
- Connectivity and values can be modelled using a single object
- A basic descriptive framework can be implemented as an extension of linear algebra
- Algebraic properties and other operations come for free
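
A plausible reading of the flattening operation, sketched in Python (the name flatten and its definition are my assumption, not the paper's):

```python
import numpy as np

def flatten(M):
    """Map a matrix over R to its support relation: 1 where M is nonzero.

    A weight matrix carries both values and connectivity; flattening
    forgets the values and keeps the underlying {0,1}-valued relation."""
    return (M != 0).astype(int)

W = np.array([[1.2, 0.0], [0.1, 9.9]])
print(flatten(W))  # [[1 0]
                   #  [1 1]]
```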

Algebraic Framework and Extended Operations

Core Algebraic Framework

The core algebraic framework consists of matrices over the field R, within which relations are the matrices with {0, 1}-valued entries. We use special matrices called relational sums to compose modular network architectures.

Application of the operation inject-top-left to a matrix R. Left: a matrix of type N → N. Right: a matrix of type N ⊕ A → N ⊕ B with R in the top-left block and zeros elsewhere:

    R  ↦  [ R  0 ]
          [ 0  0 ]
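
A minimal sketch of inject-top-left under this reading (the name itl follows the slides; the implementation is my reconstruction):

```python
import numpy as np

def itl(R, a, b):
    """Embed R : N -> N into an (n+a) x (n+b) zero matrix, top-left block."""
    n, m = R.shape
    out = np.zeros((n + a, m + b))
    out[:n, :m] = R
    return out

R = np.array([[1.0, 2.0], [3.0, 4.0]])
print(itl(R, 1, 2))  # 3x4 matrix with R in the top-left block
```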

Extended Operations: Substitution

Using relational sums, we implemented extended operations to build modular networks.

[Figure: substitution of connections by a module. The resulting network can in turn be used as another substitution module.]

Extended Operations: Substitution

Because relations are embedded in the matrix algebra, we can conveniently use matrices and relational reasoning to define operations.

[Matrix over N1-N3 and C1-C5: Boolean 1-entries encode the host network's connections; real-valued entries (0.2, 0.3, 0.6, 0.2, 0.9, 1, 0.1) encode the substituted module's weights.]

- Substitution of one connection by a module
- Can prove that connectedness is preserved

Extended Operations: Substitution

Algebraic definition of, and reasoning about, matrices using relational methods.

Definition. Let N and C be finite sets, Net : N → N and S : C → C be relations, and (N_x, N_y) ∈ Net. The substitution operation is defined by:

    subst(Net, S, (N_x, N_y)) := itl(Net \ {(N_x, N_y)}, C, C) ∪ ibr(S, N, N) ∪ {(N_x, C_1), (C_n, N_y)}

Lemma. Let N and C be finite sets, Net : N → N and S : C → C be relations, and (N_x, N_y) an element of Net. If (C_1, C_n) ∈ S⁺, then (N_x, N_y) ∈ NetS⁺, the transitive closure of NetS, where NetS = subst(Net, S, (N_x, N_y)).
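
The following Python sketch is my reconstruction of subst over relations-as-pair-sets, with a tiny check of the lemma; the index layout for itl/ibr is an assumption, not the authors' code:

```python
import numpy as np

def transitive_closure(pairs, size):
    """All pairs reachable by composing the relation with itself."""
    M = np.zeros((size, size), dtype=int)
    for a, b in pairs:
        M[a, b] = 1
    closure = M.copy()
    for _ in range(size):
        closure = np.minimum(closure + closure @ M, 1)
    return {(i, j) for i in range(size) for j in range(size) if closure[i, j]}

def subst(net, s, edge, n, c):
    """Replace edge (Nx, Ny) in net : N -> N by module s : C -> C.

    Module node C_i gets index n + i; C_1 and C_n are the module's entry
    and exit nodes, per the definition on this slide."""
    nx, ny = edge
    result = set(net) - {edge}                 # itl(Net \ {(Nx, Ny)}, C, C)
    result |= {(n + a, n + b) for a, b in s}   # ibr(S, N, N)
    result |= {(nx, n + 0), (n + c - 1, ny)}   # {(Nx, C1), (Cn, Ny)}
    return result

# Lemma check: if (C1, Cn) is in S+, then (Nx, Ny) is in NetS+.
net = {(0, 1)}          # N = {N0, N1}, one connection
s = {(0, 1), (1, 2)}    # C = {C1, C2, C3}, a path from C1 to C3
nets = subst(net, s, (0, 1), n=2, c=3)
print((0, 1) in transitive_closure(nets, 5))  # True
```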

Summary

In our paper we show that:
- Neural network architectures can be described and manipulated using existing and custom relational operations
- Operations are generic with respect to the set used to describe neural connectivity
- Properties of operations can be proven using relation-algebraic methods

Next Steps

"Models will become more like programs... will be grown automatically." [3]
(François Chollet, author of Keras)

Modular Neuroevolution

The relational matrix shown earlier is a single relation R : N → N, where N = In ⊕ F1 ⊕ F2 ⊕ A1 ⊕ A2 ⊕ B1 ⊕ B2 ⊕ C ⊕ Out.

Modular Neuroevolution

A relational representation of neural networks could have several advantages over other approaches:
- Multiple levels of organization
- Algebraic manipulation
- Conventional ANN modules
- Relational and functional modules

Modular Neuroevolution

We are currently working on a neuroevolution framework based on Cartesian genetic programming (CGP) and existing work on CGP-based ANNs.
- The algebraic framework introduced in this work will be the basis for the genetic representation and operators.
- Evolved networks will be a mix of de novo evolved modules and existing modules in the form of ANN layers and relational and functional programs.
- The representation will be based on a mapping between algebraic expressions and a recursive, modular adjacency structure.

Thank you!

References

[1] Abadi, M., et al.: TensorFlow: Large-scale machine learning on heterogeneous systems (2015).
[2] Chollet, F., et al.: Keras. github.com/fchollet/keras/ (2015).
[3] Chollet, F.: The Future of Deep Learning. https://blog.keras.io/the-future-of-deep-learning.html (2017).
[4] Davison, J.: DEvol: Automated deep neural network design with genetic programming. https://github.com/joeddav/devol (2017).

[5] Djurfeldt, M.: The Connection-set Algebra: A Novel Formalism for the Representation of Connectivity Structure in Neuronal Network Models. Neuroinformatics 10(3), 1539-2791 (2012).
[6] Gleeson, P., Crook, S., Cannon, R.C., Hines, M.L., Billings, G.O., et al.: NeuroML: A Language for Describing Data Driven Models of Neurons and Networks with a High Degree of Biological Detail. PLoS Computational Biology 6(6) (2010).
[7] HyperNEAT: eplex.cs.ucf.edu/hyperneatpage/
[8] Jones, E., Oliphant, T., Peterson, P., et al.: SciPy: Open Source Scientific Tools for Python. scipy.org/ (2001).

[9] Killingbeck, D., Santos Teixeira, M., Winter, M.: Relations among Matrices over a Semiring. In: Kahl, W., Oliveira, J.N., Winter, M. (eds.): RAMiCS 15, LNCS 9348, 96-113 (2015).
[10] Sporns, O.: Discovering the Human Connectome. MIT Press (2012).
[11] Sporns, O.: Small-world connectivity, motif composition, and complexity of fractal neuronal connections. BioSystems 85, 55-64 (2006).
[12] Theano Development Team: Theano: A Python framework for fast computation of mathematical expressions. arxiv.org/abs/1605.02688 (2016).

[13] Van der Walt, S., Colbert, S.C., Varoquaux, G.: The NumPy Array: A Structure for Efficient Numerical Computation. Computing in Science & Engineering 13, 22-30 (2011).