Persistent Chaos in High-Dimensional Neural Networks


Persistent Chaos in High-Dimensional Neural Networks
D. J. Albers, with J. C. Sprott and James P. Crutchfield
February 20, 2005

Outline:
- Introduction and motivation
- Mathematical versus computational dynamics
- Neural networks: a universal function space
- Splitting of a parameter interval
- Qualitative results
- Dynamics of the bifurcation chain region
- Future work

Introduction: understanding the world à la Poincaré

Dissipative versus non-dissipative dynamics. Problems:
- general qualitative understanding;
- invariance of measures (i.e. non-equilibrium physics);
- relationship with nature.

The goal is to present a framework for studying the above problems and to provide links between mathematical dynamics and the real world. First step: understand the dynamics of our construction.

Difference in approaches

Mathematical dynamical systems: little changes for d ≥ 3. Parameters are never an issue; the framework is with respect to C^k perturbations in the C^r Whitney topology.

Computational dynamical systems: the number of dimensions of the dynamical system matters; there is a stark difference between common dynamics in high- and low-dimensional dynamical systems. Parameters matter; the practical effect of parameters on dynamical systems is significant. Instead of selecting a manifold on which to impose dynamics, about which one can classify a generic type of behavior, the existence of parameters asks the opposite question: begin with a dynamical system, vary the parameters, observe the types of manifolds that are present, and study the types of dynamics that predominate on those manifolds.

Computational and low-dimensional intuition

Logistic map: real Fatou lemma, Jakobson absolute continuity.
- Topological variation is dramatic, e.g. transitions from chaotic to periodic orbits;
- Periodic orbits and windows amidst chaotic orbits are common.

[Figure: bifurcation diagram of the standard logistic map, x_t versus the parameter a over roughly 1.5 to 4.]
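As a concrete reference for this low-dimensional intuition, a minimal script (not part of the original slides) that regenerates this kind of diagram; the parameter range is chosen to match the figure:

```python
import numpy as np
import matplotlib.pyplot as plt

# Sweep the logistic map x -> a*x*(1-x) over a in [1.5, 4] and plot the
# asymptotic orbit at each a, reproducing the bifurcation diagram above.
a_vals = np.linspace(1.5, 4.0, 2000)
x = 0.5 * np.ones_like(a_vals)
for _ in range(500):                 # discard the transient
    x = a_vals * x * (1.0 - x)
for _ in range(200):                 # record the asymptotic orbit
    x = a_vals * x * (1.0 - x)
    plt.plot(a_vals, x, ',k')
plt.xlabel('a')
plt.ylabel('x_t')
plt.show()
```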

Mathematical dynamics intuition
- Everything is Anosov-like (cat map, stacks of Anosovs, etc.).
- Topological variation is relatively uncommon, though there are many conflicting stories (Smale, Palis, Robbin, etc.).
- Periodic windows are likely rare (Pugh).
- Many (non-dissipative, at the moment) dynamical systems are ergodic à la Boltzmann (Pugh-Shub).
- Dissipative dynamical systems are hard to handle (current hope: SRB measures).

Artificial neural networks

Definition 1 A neural network is a C^r mapping γ : R^n → R. The set of feedforward networks with a single hidden layer, Σ(G), can be written:

    Σ(G) ≡ { γ : R^d → R | γ(x) = Σ_{i=1}^N β_i G(x̃^T ω_i) }    (1)

where x ∈ R^d is the d-vector of network inputs, x̃^T ≡ (1, x^T) (where x^T is the transpose of x), N is the number of hidden units (neurons), β_1, ..., β_N ∈ R are the hidden-to-output layer weights, ω_1, ..., ω_N ∈ R^{d+1} are the input-to-hidden layer weights, and G : R → R is the hidden layer activation function (or neuron). The network is iterated as a time-delayed map:

    x_t = β_0 + Σ_{i=1}^N β_i G( s ω_{i0} + s Σ_{j=1}^d ω_{ij} x_{t-j} )    (2)

with ω_{ij} ∼ N(0, s), β_i uniform on [0, 1], G = tanh, d = number of inputs, N = number of neurons.
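A minimal sketch of the map in Eq. (2), assuming the weight conventions stated above (zero-mean normal ω, β uniform on [0, 1], G = tanh). The function names, the seeds, and the choice to draw ω from N(0, 1) and let the explicit factor s in Eq. (2) carry the scaling are ours, not the authors' code:

```python
import numpy as np

def make_network(d, N, seed=0):
    """Draw a random single-hidden-layer network (names and seeds are illustrative).

    omega[i, 0] is the hidden-unit bias w_{i0}; omega[i, 1:] are the weights w_{ij}.
    beta[0] is beta_0; beta[1:] are the hidden-to-output weights beta_i.
    The slide states omega_ij ~ N(0, s); here omega ~ N(0, 1) and the explicit
    factor s in Eq. (2) carries the scaling -- an interpretation, not a quote.
    """
    rng = np.random.default_rng(seed)
    omega = rng.standard_normal((N, d + 1))
    beta = rng.uniform(0.0, 1.0, size=N + 1)
    return omega, beta

def step(history, omega, beta, s):
    """One iterate of Eq. (2): x_t from (x_{t-1}, ..., x_{t-d})."""
    x = np.asarray(history)
    pre = s * omega[:, 0] + s * (omega[:, 1:] @ x)
    return beta[0] + beta[1:] @ np.tanh(pre)

# Example: iterate a d = 4, N = 4 network for 1000 steps.
d, N, s = 4, 4, 0.5
omega, beta = make_network(d, N)
hist = list(np.random.default_rng(1).uniform(-1.0, 1.0, d))
for _ in range(1000):
    hist = [step(hist, omega, beta, s)] + hist[:-1]
```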

Neural networks and the Takens embedding theorem

[Schematic: the dynamical system F : M → M acts on the abstract (manifold) space; the Takens map g : M → R^{2d+1} carries it to measurement space, where the reconstructed map is F̃ = g ∘ F ∘ g^{-1} acting on points (E(p), E(F(p)), ..., E(F^{2d}(p))) ∈ R^{2d+1}.]

Here F is the dynamical system, E : M → R is a C^k map representing some empirical-style measurement of F, and g is Takens's map:

    g(x_t) = (E(x_t), E(F(x_t)), ..., E(F^{2d}(x_t)))    (3)
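A small sketch of the Takens map g of Eq. (3), using a toy two-dimensional map as a stand-in for F and observing only its first coordinate; all names here are illustrative:

```python
import numpy as np

def takens_map(F, E, p, d):
    """Build the delay vector g(p) = (E(p), E(F(p)), ..., E(F^{2d}(p))) of Eq. (3)."""
    coords, x = [], p
    for _ in range(2 * d + 1):
        coords.append(E(x))
        x = F(x)
    return np.array(coords)

# Toy usage: the Henon map as a stand-in dynamical system, measured by its first coordinate.
F = lambda z: np.array([1.0 - 1.4 * z[0] ** 2 + z[1], 0.3 * z[0]])
E = lambda z: z[0]
v = takens_map(F, E, np.array([0.1, 0.0]), d=2)   # a point in R^5
```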

What can we approximate with neural networks?
- Any function from a Sobolev space (Lebesgue-integrable functions with weak derivatives).
- Any function in C^r, r ≥ 0 (ODEs, maps, etc.).
- Piecewise-smooth functions with properly chosen domains.
- Most PDEs.

Splitting of the s parameter interval

Bifurcation diagrams along an interval of the s parameter. There are roughly five regions: the first bifurcation region (I), the routes-to-chaos region (II), chaos to bifurcation chains (III), the bifurcation chain region (IV), and bifurcation chains to finite-state dynamics (V). The prototypical high-dimensional case is I → II → IV → V.

[Figure: bifurcation diagram of x_t and the largest Lyapunov exponent versus s (logarithmic scale) for N = 4 and d = 4, showing regions I, II, III, and V; no region IV.]

[Figure: bifurcation diagram of x_t and the largest Lyapunov exponent versus s (logarithmic scale) for N = 4 and d = 64, showing all regions I, II, III, IV, and V.]

[Figure: bifurcation diagram of x_t and the largest Lyapunov exponent versus s (logarithmic scale) for N = 64 and d = 64, showing regions I, II, and IV; region V is not displayed. The prototypical scenario.]

General lore over all regions:
1. As the dimension is increased, the probability of a network being chaotic over some portion of its parameter space goes to unity.
2. For the networks considered, at d = 50 there exists an s value such that the maximum probability of chaos is about fifty percent.
3. As the dimension is increased, the Kaplan-Yorke dimension scales like d/2.
4. As the dimension is increased, the probability of the first bifurcation being Neimark-Sacker (bifurcation to a quasi-periodic orbit) approaches unity.
5. As the dimension is increased, the dominant route to chaos from a fixed point along a one-dimensional interval is the quasi-periodic route involving tori (T^2 for d = 64).
6. As the dimension is increased, the existence of periodic windows in parameter space scales like 1/d.
7. As the dimension is increased, the number of positive exponents scales like d/4.

8. As the dimension is increased, in the chaotic region of parameter space, hyperbolicity is violated on an increasingly dense (countable, Lebesgue-measure-zero) subset of the parameter interval.
9. As the dimension is increased, chaos becomes more robust with respect to parameter changes.

Outline of the numerical arguments for region IV

Along the 1-dimensional s-interval:
- formulation of the conjecture
- list of properties we require
- new definitions
- formalization of the conjectures
- evidence

High-dimensional generalization:
- new definitions
- formalization of the conjectures
- evidence

Lyapunov characteristic exponent (LCE) spectrum
1. The number of positive exponents = the number of expanding (stretching) directions.
2. The number of negative exponents = the number of contracting (squashing) directions.
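The slides rely throughout on the LCE spectrum of the network map. A standard Benettin-style QR sketch for estimating it, written against the `make_network`/`step` sketch above; the analytic Jacobian of the delay-coordinate map is ours, and the authors' exact algorithm is not specified here:

```python
import numpy as np

def jacobian(history, omega, beta, s):
    """Jacobian of the delay map (x_{t-1},...,x_{t-d}) -> (x_t, x_{t-1},...,x_{t-d+1})."""
    x = np.asarray(history)
    d = len(x)
    pre = s * omega[:, 0] + s * (omega[:, 1:] @ x)
    # dx_t/dx_{t-j} = sum_i beta_i * (1 - tanh^2(pre_i)) * s * omega_{ij}
    top = (beta[1:] * (1.0 - np.tanh(pre) ** 2)) @ (s * omega[:, 1:])
    J = np.zeros((d, d))
    J[0, :] = top
    J[1:, :-1] = np.eye(d - 1)        # the remaining delay coordinates just shift down
    return J

def lyapunov_spectrum(omega, beta, s, d, n_iter=20000, n_trans=1000, seed=2):
    """Estimate the full LCE spectrum by repeated QR re-orthonormalization."""
    rng = np.random.default_rng(seed)
    hist = list(rng.uniform(-1.0, 1.0, d))
    for _ in range(n_trans):          # discard the transient
        hist = [step(hist, omega, beta, s)] + hist[:-1]
    Q = np.eye(d)
    sums = np.zeros(d)
    for _ in range(n_iter):
        Q, R = np.linalg.qr(jacobian(hist, omega, beta, s) @ Q)
        sums += np.log(np.abs(np.diag(R)))
        hist = [step(hist, omega, beta, s)] + hist[:-1]
    return np.sort(sums / n_iter)[::-1]   # exponents, largest first
```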

[Figure: positive Lyapunov exponents versus the s parameter for typical individual networks with 32 neurons and 16 (left) and 64 (right) dimensions. Note the "fuzz" in these plots.]

List of properties for the 1-dimensional parameter interval picture
The following properties must increase with dimension:
a. the number of positive exponents;
b. the continuity of the exponents relative to parameter change;
c. the asymptotic density of transversal Lyapunov exponent zero crossings.

Numerical continuity: small variation in the parameter leads to small variation in the LCEs.
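A sketch of how this "num-continuity" could be measured, reusing the `lyapunov_spectrum` sketch above; the absolute-difference form matches the quantity plotted a few slides below, but the function name and defaults are ours:

```python
import numpy as np

def num_continuity(omega, beta, s, d, ds=1e-2, **kw):
    """Mean |chi_i(s) - chi_i(s + ds)| over the spectrum: small values mean the
    exponents vary smoothly with the parameter s."""
    chi_a = lyapunov_spectrum(omega, beta, s, d, **kw)
    chi_b = lyapunov_spectrum(omega, beta, s + ds, d, **kw)
    return np.mean(np.abs(chi_a - chi_b))
```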

Diagram for bifurcation chain sets

[Schematic: an intuitive diagram of a chain link set V, bifurcation link sets V_i, and a bifurcation chain set U, for an LCE-decreasing chain link set V. The points a_1, ..., a_5 separate the adjacent open intervals V_1, ..., V_4, with U = {a_i} and V = ∪_i V_i.]

Definition 2 (Chain link set) Assume f is a mapping (neural network) as previously defined. A chain link set is denoted:

    V = { s ∈ R | χ_j(s) ≠ 0 for all 0 < j ≤ d, and χ_j(s) > 0 for some j > 0 }

Next, let C_k be a connected component of the closure of V, V̄. It can be shown that C_k ∩ V is a union of disjoint, adjacent open intervals of the form ∪_i (a_i, a_{i+1}).

Definition 3 (Bifurcation link set) Assume f is a mapping (neural network) as previously defined. Denote a bifurcation link set of C_k ∩ V as:

    V_i = (a_i, a_{i+1})    (4)

Definition 4 (Bifurcation chain subset) Let V be a chain link set, and C_k a connected component of V̄. A bifurcation chain subset of C_k ∩ V is denoted:

    U_k = {a_i}    (5)

or equivalently:

    U_k = ∂(C_k ∩ V)    (6)

Definition 5 (ε-dense) Given ε > 0, an open interval (a, b) ⊂ R, and a finite set {c_1, ..., c_n}, the set {c_1, ..., c_n} is ε-dense in (a, b) if for any x ∈ (a, b) there is an i, 1 ≤ i ≤ n, such that dist(x, c_i) < ε.

However, we really want a sequence of sets:

    {c^1_1, ..., c^1_{n_1}}, {c^2_1, ..., c^2_{n_2}}, ...    with n_{i+1} > n_i.

Definition 6 (Asymptotically dense (a-dense)) A sequence S_j = {c^j_1, ..., c^j_{n_j}} ⊂ (a, b) of finite subsets is asymptotically dense in (a, b) if, for any ε > 0, there is an N such that S_j is ε-dense for all j ≥ N.
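A small numerical check of Definitions 5 and 6 on finite data; a conservative sketch whose helper names are ours:

```python
import numpy as np

def is_eps_dense(points, a, b, eps):
    """Definition 5: every x in (a, b) lies within eps of some point (conservative check)."""
    c = np.sort(np.asarray(points, dtype=float))
    if len(c) == 0:
        return False
    gaps = [c[0] - a, b - c[-1]]
    if len(c) > 1:
        gaps.append(0.5 * np.max(np.diff(c)))   # worst case is midway between neighbours
    return max(gaps) < eps

def first_eps_dense_index(sets, a, b, eps):
    """Crude finite check of Definition 6: the first N such that every later S_j in the
    (finite) family is eps-dense in (a, b); returns None if no such N exists."""
    for N in range(len(sets)):
        if all(is_eps_dense(S, a, b, eps) for S in sets[N:]):
            return N
    return None
```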

Two conjectures

Conjecture 1 (Existence of a codimension-ε bifurcation set) Assume f is a mapping (neural network) as previously defined with a sufficiently high number of dimensions, d, and a bifurcation chain set U as previously defined. The two following (equivalent) statements hold:

i. In the infinite-dimensional limit, the cardinality of U will go to infinity, and the length max_i |a_{i+1} - a_i| over all i will tend to zero on a one-dimensional interval in parameter space. In other words, the bifurcation chain set U will be a-dense in its closure, Ū.

ii. In the asymptotic limit of high dimension, for all s ∈ U and for all f at s, an arbitrarily small perturbation δ_s of s will produce a topological change. The topological change will correspond to a different number of global stable and unstable manifolds for f at s compared to f at s + δ_s.

Conjecture 2 (Periodic window probability decreasing) Assume f is a mapping (neural network) as previously defined and a bifurcation chain set U as previously defined. In the asymptotic limit of high dimension, the length of the bifurcation chain set, l = a_n - a_1, increases such that the cardinality of U approaches m, where m is the maximum number of positive Lyapunov exponents for f. In other words, there will exist an interval in parameter space (e.g. s ∈ (a_1, a_n), roughly (0.1, 4)) on which the probability of the existence of a periodic window goes to zero (with respect to Lebesgue measure on the interval) as the dimension becomes large.

Or: #U → ∞ as d → ∞; there exists only a single V (chain link set).

Reminder: list of properties for the 1-dimensional parameter interval picture
The following properties must increase with dimension:
a. the number of positive exponents;
b. the continuity of the exponents relative to parameter change;
c. the asymptotic density of transversal Lyapunov exponent zero crossings.

Mean maximum number of positive Lyapunov exponents

[Figure: mean maximum number of positive LCEs versus dimension; all networks have 32 neurons. The slope is approximately 1/4.]

num-continuity versus s

[Figure: num-continuity (the mean of |χ_i(s) - χ_i(s + δs)| over i) versus parameter variation: 32 neurons, 4 (left) and 64 (right) dimensions.]

num-continuity versus dimension

[Figure: mean num-continuity, together with the num-continuity of the largest and the most negative Lyapunov exponents, for many networks versus their dimension. The error bars are the standard deviation about the mean over the number of networks considered.]

Intuition for a-density of zero crossings

[Figure: number of positive Lyapunov exponents versus the s parameter for typical individual networks with 32 neurons and 32 (left) and 128 (right) dimensions.]

a-density of the first 10 zero crossings

[Figure: mean distance (δs) between each of the first 10 zero crossings of the Lyapunov exponents, for many networks with 32 neurons and 16, 32, 64, and 128 dimensions.]

Review of the 1-dimensional parameter interval conjectures
a. With arbitrarily large dimension, there will be arbitrarily many positive Lyapunov exponents to provide our needed exponent zero crossings.
b. The exponents become more continuous with respect to variation of the s parameter as the dimension of the dynamical system is increased, which helps to guarantee smooth, transversal zero crossings of Lyapunov exponents as well as no abrupt topological change.
c. The Lyapunov exponent zero crossings become more tightly packed, i.e. the bifurcation chain set is becoming a-dense in its closure.

Alternative argument with positive LCE scaling

[Figure: scaling of the positive LCE spectrum for d = 8, 16, 32, 64, and 128.]

High-dimensional parameter set generalization

Definitions, conjectures, and an outline of evidence. Basic results: general stability of instability; chaos reigns; periodic windows disappear or become unobservable in portions of parameter space. The geometry of the dynamics is not drastically changed upon perturbation of parameters.

The abstraction of parameter perturbation

Consider the map:

    φ : R^{N(d+2)+1} → Σ(G)    (7)

(the N(d+2)+1 parameters are the N(d+1) input-to-hidden weights, the N hidden-to-output weights, and β_0). Things to be concerned about with respect to φ: continuity, affine structure, differentiability (e.g. is φ ∈ C^0, C^r for r ≥ 1, etc.). For now, we will assume that an open ball in R^{N(d+2)+1} yields an open ball in Σ(G) (however, Σ(G) is infinite-dimensional whereas R^{N(d+2)+1} is not).

Robustness of a property with respect to a space

A property X of some operation or morphism (e.g. a mapping) Y is robust if X is a persistent property in an open neighborhood of Y. To specify robustness I need:
- the property;
- the mapping;
- the open set upon which the property is persistent.

k-degree LCE stability

Definition 7 (Degree-k Lyapunov exponent equivalence) Assume two discrete-time maps f and g (networks) as previously defined, from a compact set to itself. The mappings f and g are called k-Lyapunov equivalent if f and g have the same number of Lyapunov exponents and the same number of positive Lyapunov exponents.

Definition 8 (Robust chaos of degree k) Assume a discrete-time map f (network) as previously defined, mapping a compact set to itself. The map f has robust chaos of degree k if there exists a p-dimensional subset U ⊂ R^p, p ∈ N, such that, for all ξ ∈ U and for all initial conditions of the map f at ξ, f retains k positive Lyapunov exponents. In other words, f is k-Lyapunov equivalent on the subset U.
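A sketch of how the degree k of Definition 8 could be probed numerically: perturb every weight slightly many times and count how many positive exponents always survive. It reuses the earlier `lyapunov_spectrum` sketch; the perturbation size 10^-3 and the 100 repetitions echo the experiment described later in the slides, but the function itself is illustrative:

```python
import numpy as np

def robust_chaos_degree(omega, beta, s, d, n_perturb=100, scale=1e-3, seed=3, **kw):
    """Minimum number of positive LCEs over an ensemble of weight-perturbed networks."""
    rng = np.random.default_rng(seed)
    k = None
    for _ in range(n_perturb):
        om = omega + scale * rng.standard_normal(omega.shape)
        be = beta + scale * rng.standard_normal(beta.shape)
        chi = lyapunov_spectrum(om, be, s, d, **kw)
        n_pos = int(np.sum(chi > 0))
        k = n_pos if k is None else min(k, n_pos)
    return k
```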

Conjectures

Conjecture 3 (Robust chaos with respect to parameter perturbation in R^p) Assume f is a mapping (neural network) as previously defined with a sufficiently high number of dimensions, d, and let p = N(d+2)+1. There will exist an open set V in R^p (with significant Lebesgue measure) of parameter space on which chaos will be a robust dynamic.

Conjecture 4 (k-robust chaos in large dynamical systems) Assume f is a mapping (neural network) as previously defined with a sufficiently high number of dimensions, d, and an arbitrarily high number of parameters (neurons), p = N(d+2)+1. There will exist an open set (with significant Lebesgue measure) in parameter space (R^p) on which chaos will be robust of degree k, with k → ∞ as d → ∞.

Conjecture 5 (Periodic window probability diminishing) Assume f is a mapping (neural network) as previously defined with a sufficiently high number of dimensions, d. There will exist a set V ⊂ R^p (again, p = N(d+2)+1) in parameter space such that there will not exist periodic windows on a positive-Lebesgue-measure set within V.

k-degree LCE stability

Point: with respect to the distribution of positive exponents, the mean must increase while the variance must not explode.

[Figure: histograms of the number of positive Lyapunov exponents over the ensemble for d = 8 (left) and d = 64 (right).]

Characterizing k

Characterizations of the degree of k-degree LCE stability:
i. it could be defined as the minimum number of positive exponents over an ensemble of perturbed networks: k = 1 at d = 64, at s = 3;
ii. it could be defined as the mean number of positive Lyapunov exponents minus the standard deviation about the mean: k = 1 at d = 4, k = 8 at d = 64, at s = 3;
iii. it could be defined as the lower boundary above which 99 percent of the area of the distribution of the number of positive Lyapunov exponents is contained: k = 1 at d = 32, and k = 5 at d = 64, at s = 3.

k is increasing with d.
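The three characterizations (i)-(iii) can all be read off a single ensemble of positive-exponent counts; a minimal sketch, interpreting (iii) as the lower 1-percent boundary of the distribution (the function name is ours):

```python
import numpy as np

def characterize_k(counts):
    """counts: number of positive LCEs for each perturbed network in the ensemble."""
    counts = np.asarray(counts, dtype=float)
    k_min = counts.min()                              # (i) ensemble minimum
    k_mean_minus_std = counts.mean() - counts.std()   # (ii) mean minus one standard deviation
    k_lower_1pct = np.percentile(counts, 1.0)         # (iii) 99 percent of the distribution lies above
    return k_min, k_mean_minus_std, k_lower_1pct
```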

Example: 64-dimensional network

[Figure: histogram of the number of positive Lyapunov exponents for a typical 64-dimensional network perturbed 100 times.]

Probability of periodic windows decreases with dimension like 1/d^2

[Figure: log probability of the existence of periodic windows versus log of the dimension, for 500 cases per d. For each case, all the weights are perturbed on the order of 10^-3, 100 times per case. The line of best fit scales like 1/d^2.]
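A sketch of the fit behind the 1/d^2 claim: regress the log of the observed window probability on the log of the dimension and read off the slope. The function name and arguments are illustrative, and no data are hard-coded:

```python
import numpy as np

def window_scaling_exponent(dims, window_counts, cases_per_dim=500):
    """Fit log2(P[periodic window]) against log2(d); a slope near -2 matches ~1/d^2.

    dims          : dimensions d that were tested
    window_counts : number of perturbed cases showing a periodic window at each d
    cases_per_dim : total perturbed cases per dimension (500 in the slide's experiment)
    """
    p = np.asarray(window_counts, dtype=float) / float(cases_per_dim)
    slope, _ = np.polyfit(np.log2(np.asarray(dims, dtype=float)), np.log2(p), 1)
    return slope
```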

Periodic windows summary:
- The probability that a network has a window decreases much more slowly than the probability of observing periodic orbits within the networks that do show them.
- As the dimension of a dynamical system is increased, fewer windows are observed, and those windows are concentrated in networks with many windows.

Conclusions with respect to region IV:
- As the dimension is increased, chaos becomes a dynamic type that is robust with respect to parameter change.
- Topological change, i.e. a change in the number of positive Lyapunov exponents and hence a change in the number of global stable and unstable manifolds, is much less drastic under parameter variation as the dimension of a dynamical system is increased.
- Periodic orbits are much less common in high-dimensional, chaotic dynamical systems, and observable periodic orbits are concentrated in a smaller set of networks as the dimension of the network is increased.

Future work

Current projects:
1. Stability of the LCE algorithm.
2. Analysis of region II, the routes-to-chaos region (analytical and numerical).
3. Scaling with respect to the LCE spectrum.
4. Scaling with respect to the number of positive LCEs versus s.
5. Linking the Takens embedding theorem to the neural network approximation theorems.

Future projects:
1. Uniform and non-uniform partial hyperbolicity of high-dimensional neural networks. Compare this with results of both Pesin and Bonatti-Viana regarding SRB measures and Lyapunov exponents.
2. Basins of attraction, existence of Milnor attractors, relation with the finite-SRB-measure conjecture of Palis and the Milnor attractor results of Kaneko.

3. A symbolic-dynamics, anti-integrable-limit study of region V; in other words, a detailed study of the transition from chaos to finite-state orbits.
4. Direct connection to nature: train networks on high-dimensional experimental and numerical data sets, study weight distributions.
5. Forever transient: a generalized notion of dynamic stability in systems never allowed time to converge to an ergodic-type limit, e.g. the time-evolution of the weight distribution.
6. Robustness with respect to weight distributions: increases generality; useful for comparison with fitted networks.