Critical Temperatures for Intermittent Search in Self-Organizing Neural Networks
1 Critical Temperatures for Intermittent Search in Self-Organizing Neural Networks
Peter Tiňo, School of Computer Science, University of Birmingham, UK
Critical Temperatures for Intermittent Search in Self-Organizing Neural Networks p.1/25
2 Motivation
Self-Organizing Neural Network (SONN) (Smith, 1995) - a general methodology for solving 0-1 assignment problems.
Wide variety of successful applications - assembly line sequencing, frequency assignment in mobile communications, etc.
Softmax renormalization - incorporated into SONN in (Guerrero et al., 2002). Can boost performance dramatically.
The softmax function contains a free parameter - the temperature T.
At critical temperatures - powerful intermittent search for high-quality solutions.
No theory of critical temperatures so far; settings found by trial and error.
3 SONN with softmax weight renormalization
A finite set of items i ∈ I = {1,2,...,N} is to be assigned to items j ∈ J = {1,2,...,M} so that the global cost Q(A) of the assignment A : I → J is minimized.
V(i,j) - partial cost of assigning i ∈ I to j ∈ J.
The "strength" of assigning i to j - "weight" w_{i,j} ∈ (0,1).
A 0-1 assignment solution is produced by imposing A(i) = j iff j = argmax_{k ∈ J} w_{i,k}.
4 SONN Algorithm
Initialize the weights w_{i,j} randomly to values around 0.5.
Choose a j_c ∈ J.
Calculate the partial costs V(i,j_c), i ∈ I, of all possible assignments to j_c.
"Winner" item i(j_c) ∈ I - the one that minimizes V(i,j_c).
Neighborhood B_L(i(j_c)) of size L of i(j_c) - the L items (nodes) i ≠ i(j_c) that yield the smallest partial costs V(i,j_c).
5 SONN Algorithm - cont'd
Weights from nodes i ∈ B_L(i(j_c)) to j_c get strengthened:

    w_{i,j_c} ← w_{i,j_c} + η(i) (1 − w_{i,j_c}),  i ∈ B_L(i(j_c)),

where η(i) is the neighborhood function

    η(i) = β exp( [V(i(j_c),j_c) − V(i,j_c)] / [V(k(j_c),j_c) − V(i,j_c)] )

and k(j_c) = argmax_{i ∈ I} V(i,j_c).
The weights w_i = (w_{i,1}, w_{i,2}, ..., w_{i,N}) of each input node i ∈ I are then normalized using softmax:

    w_{i,j} ← exp(w_{i,j}/T) / Σ_{k=1}^{N} exp(w_{i,k}/T).
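The update-and-renormalize step above can be sketched in Python. This is a minimal sketch, not the authors' implementation: the random cost matrix, the parameter values, and the guard against a zero denominator in η are illustrative assumptions.

```python
import math
import random

def sonn_step(W, V, j_c, L=3, beta=0.5, T=0.1):
    """One SONN iteration for a chosen output j_c: strengthen the weights of
    the winner's neighborhood, then softmax-renormalize every weight row."""
    N = len(W)
    costs = [V[i][j_c] for i in range(N)]
    order = sorted(range(N), key=lambda i: costs[i])
    winner, worst = order[0], order[-1]
    for i in order[:L + 1]:             # winner plus its L cheapest neighbors
        den = costs[worst] - costs[i]   # guard: eta = beta when den would be 0
        eta = beta * math.exp((costs[winner] - costs[i]) / den) if den > 0 else beta
        W[i][j_c] += eta * (1.0 - W[i][j_c])
    for i in range(N):                  # softmax renormalization, temperature T
        z = [math.exp(w / T) for w in W[i]]
        s = sum(z)
        W[i] = [v / s for v in z]
    return W
```

Each weight row returns to the simplex after the softmax step, whatever the additive update did to it.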
6 SONN
[Figure: SONN architecture - items i = 1,...,N with partial costs V(1,j),...,V(N,j); the assignment weights w_{i,j} to outputs j = 1,...,M are strengthened within the neighborhood B_L(i(j)), followed by SoftMax normalization.]
7 Weight updates
[Figure: for a fixed input j, the assignment weights (w_{1,j}, w_{2,j}, w_{3,j}) are updated and then mapped back onto the simplex S by softmax renormalization.]
8 SoftMax renormalization - temperature T
[Figure: the updated weights (w_{1,j}, w_{2,j}, w_{3,j}) for a fixed input j are renormalized onto the simplex S; the shape of the map depends on the temperature T.]
9 Autonomous dynamics - ISM
[Figure: trajectories of the renormalization dynamics on the simplex S in the high temperature regime (left) and the low temperature regime (right).]
10 Critical temperatures
The renormalization step is crucial for intermittent search by SONN for globally optimal assignment solutions.
Symmetry-breaking bifurcations of equilibria of the renormalization procedure mark the temperatures T at which optimal (both in terms of quality and quantity of found solutions) intermittent search takes place.
We formulate analytical approximations to the critical temperatures.
Good correspondence with experimentally found values.
11 Iterative SoftMax (ISM)
The (N−1)-dimensional simplex in R^N:

    S_N = { w = (w_1, w_2, ..., w_N)^T ∈ R^N | w_i ≥ 0, Σ_{i=1}^N w_i = 1 }.

Given a temperature T > 0, the softmax maps R^N into S_N:

    w → F(w;T) = (F_1(w;T), F_2(w;T), ..., F_N(w;T))^T,

where

    F_i(w;T) = exp(w_i/T) / Σ_{k=1}^N exp(w_k/T),  i = 1,2,...,N.

ISM on S_N:

    w(t+1) = F(w(t);T).
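The ISM dynamics is straightforward to simulate; a minimal sketch (the starting point, temperatures, and step count below are illustrative choices):

```python
import math

def ism_step(w, T):
    """One application of the softmax map F(.; T): R^N -> S_N."""
    z = [math.exp(wi / T) for wi in w]
    s = sum(z)
    return [zi / s for zi in z]

def ism_orbit(w, T, steps=200):
    """Iterate w(t+1) = F(w(t); T) for a fixed number of steps."""
    for _ in range(steps):
        w = ism_step(w, T)
    return w
```

Consistent with the two regimes pictured on the previous slide: at high temperature (e.g. T = 1, N = 3) the orbit settles at the maximum entropy point, while at low temperature (e.g. T = 0.05) it is attracted to a near-"one-hot" equilibrium.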
12 Equilibria of ISM
The maximum entropy point w̄ = (1/N, 1/N, ..., 1/N) is a fixed point of ISM for any T.
Except for the maximum entropy fixed point, all the other fixed points must have exactly 2 distinct coordinate values.
Write the larger of the two fixed-point coordinate values as γ_1(w;T) = α/N, α ∈ (1,N).
The richest structure of equilibria is found in the center of S_N (around w̄).
The least number of fixed points exists around the vertices, i.e. in the "corner" areas of S_N.
13 Number of ISM Equilibria
Theorem: Fix α ∈ (1,N) and write γ_1 = α/N. Let l_min be the smallest natural number greater than (α−1)/γ_1. Then, for l ∈ {l_min, l_min+1, ..., N−1}, at temperature

    T_e(γ_1; N, l) = −(α−1) / [ l ln( 1 − (α−1)/(l γ_1) ) ],

there exist (N choose l) distinct fixed points of ISM, with (N−l) coordinates having value γ_1 and l coordinates equal to

    γ_2 = (1 − (N−l) γ_1) / l.
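The theorem can be checked numerically: build the two-valued point, evaluate T_e, and confirm the point is fixed under the softmax map. A sketch; the concrete values N = 10, γ_1 = 0.15, l = 4 are illustrative (for α = 1.5, the smallest admissible l is 4).

```python
import math

def equilibrium_temperature(gamma1, N, l):
    """T_e(gamma1; N, l) from the theorem, with alpha = N * gamma1."""
    alpha = N * gamma1
    return -(alpha - 1) / (l * math.log(1 - (alpha - 1) / (l * gamma1)))

def two_valued_point(gamma1, N, l):
    """Fixed-point candidate: N - l coordinates gamma1, l coordinates gamma2."""
    gamma2 = (1 - (N - l) * gamma1) / l
    return [gamma1] * (N - l) + [gamma2] * l
```

Applying softmax at T_e to the two-valued point reproduces the point itself, coordinate by coordinate.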
14 Number of ISM Equilibria (N = 10)
[Figure: number of possible fixed points as a function of γ_1.]
15 Stable ISM Equilibria
Theorem: Consider a fixed point w ∈ S_N of ISM with one of its coordinates equal to γ_1, 1/N < γ_1 < 1. Set

    T_s(γ_1) = T_{s,2}(γ_1) = γ_1,               if γ_1 ∈ [1/N, 1/2),
    T_s(γ_1) = T_{s,1}(γ_1) = 2 γ_1 (1 − γ_1),   if γ_1 ∈ [1/2, 1).

Then, if T > T_s(γ_1), the fixed point w is stable.
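The stability condition can be probed by direct iteration. Below, a near-"one-hot" two-valued point for N = 10 (γ_1 = 0.9, l = 9, so one large coordinate) lives at a temperature T_e above T_s(γ_1), so a small perturbation decays back to it. A sketch; the perturbation size, step count, and tolerances are arbitrary choices.

```python
import math

def t_s(gamma1):
    """Sufficient stability temperature T_s from the theorem."""
    return gamma1 if gamma1 < 0.5 else 2 * gamma1 * (1 - gamma1)

def ism_step(w, T):
    """One application of the softmax map F(.; T)."""
    z = [math.exp(x / T) for x in w]
    s = sum(z)
    return [v / s for v in z]

N, l, g1 = 10, 9, 0.9
g2 = (1 - (N - l) * g1) / l            # the small coordinate, 1/90
T_e = (g1 - g2) / math.log(g1 / g2)    # temperature at which this point is fixed
```

Here T_e ≈ 0.202 exceeds T_s(0.9) = 0.18, so the theorem guarantees stability of this equilibrium.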
16 Unstable ISM Equilibria
Theorem: Consider a fixed point w ∈ S_N of ISM with one of its coordinates equal to γ_1, 1/N < γ_1 < 1. Let N−l be the number of coordinates of value γ_1. Then, if

    T < T_u(γ_1; N, l) = γ_1 (2 − N γ_1) (N−l)/(Nl) + 1/N − 1/(Nl),

w is not stable.
17 Stability and existence of ISM fixed points (N = 10)
[Figure: two panels in the (γ_1, T) plane showing regions of stable equilibria, potentially saddle-type equilibria, and unstable equilibria, for several values of N_1 and of N.]
N_1 - number of coordinates with value γ_1.
Bold line - T_s(γ_1).
Solid normal lines - T_u(γ_1; N, l).
Dashed lines - T_e(γ_1; N, l).
The bifurcation temperature decreases with increasing N.
18 Approximations to the critical temperature
1. Expand T_e(γ_1; N, N−1) around γ_0 = 0.9 as a second-order polynomial T^(2)_N(γ_1).
2. Solve T^(2)_N(γ_1) = T_s(γ_1) for γ_1.
3. Plug the solution γ_1^(2) back into T_s, i.e. calculate T^(2)(N) = T_s(γ_1^(2)).
A cheaper approximation T^(1)(N) can be obtained using linear expansions.
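The three steps can be carried out numerically. This is a sketch of the procedure, not the paper's derivation: the Taylor coefficients are obtained by finite differences rather than symbolically, and the crossing with T_s is located by bisection on [1/2, 1); N = 10 and N = 30 in the usage below are illustrative.

```python
import math

def t_e(g1, N, l):
    """Equilibrium temperature T_e(g1; N, l), with alpha = N * g1."""
    a = N * g1
    return -(a - 1) / (l * math.log(1 - (a - 1) / (l * g1)))

def t_s(g1):
    """Sufficient stability temperature T_s."""
    return g1 if g1 < 0.5 else 2 * g1 * (1 - g1)

def critical_temperature(N, g0=0.9, h=1e-3):
    """T^(2)(N): second-order expansion of T_e(.; N, N-1) around g0,
    intersected with T_s; returns T_s evaluated at the crossing."""
    l = N - 1
    f0 = t_e(g0, N, l)
    d1 = (t_e(g0 + h, N, l) - t_e(g0 - h, N, l)) / (2 * h)
    d2 = (t_e(g0 + h, N, l) - 2 * f0 + t_e(g0 - h, N, l)) / (h * h)
    def taylor(g):
        return f0 + d1 * (g - g0) + 0.5 * d2 * (g - g0) ** 2
    lo, hi = 0.5, 1.0 - 1e-9          # bracket of the crossing on [1/2, 1)
    for _ in range(200):              # bisection on taylor(g) - t_s(g)
        mid = 0.5 * (lo + hi)
        if taylor(mid) - t_s(mid) < 0:
            lo = mid
        else:
            hi = mid
    return t_s(0.5 * (lo + hi))
```

For example, `critical_temperature(10)` comes out near 0.2, and the value shrinks as N grows, in line with the decreasing bifurcation temperatures reported for larger problems.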
19 Example: N-Queen problem
Originally posed in the 19th century. A benchmark for constraint satisfaction and assignment problems.
Place N queens onto an N × N chessboard so that no two queens attack each other.
J = {1,2,...,N} and I = {1,2,...,N} index the columns and rows, respectively, of the chessboard.
The partial cost V(i,j) evaluates the diagonal and column contributions to the global cost Q of placing a queen on column j of row i.
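A plausible stand-in for such a partial cost, assuming the slide's description (the exact form used in the experiments is not given here; counting column and diagonal conflicts against the currently assigned queens is one simple choice):

```python
def partial_cost(i, j, assignment):
    """Column and diagonal conflicts of placing a queen at (row i, column j),
    given the columns currently assigned to the other rows."""
    cost = 0
    for r, c in enumerate(assignment):
        if r == i:
            continue                  # ignore the queen being (re)placed
        if c == j or abs(r - i) == abs(c - j):
            cost += 1
    return cost
```

With rows 0..3 assigned columns (1, 3, 0, 2) - a valid 4-queens configuration - every placement on the solution itself has zero cost, while placing all queens on the main diagonal makes every pair conflict.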
20 Experiments
Increasing problem size N.
Experimentally found the best regimes for intermittent search:
- optimal neighborhood size L
- optimal temperature T*
Compared with the analytically predicted critical temperatures T^(1)(N) and T^(2)(N).
21 Bifurcation temperatures
[Figure: bifurcation temperatures versus N, with the regions where the maximum entropy point is stable (above) and unstable (below).]
Bold solid and dashed lines - T^(2)(N) and T^(1)(N), respectively.
Circles - experimentally found bifurcation temperatures T_B.
Stars - best performing temperatures T* for intermittent search in the N-queens problems.
22 Understanding intermittent search in SONN
Equilibria that can be guaranteed to be stable - the maximum entropy point w̄ and the "one-hot" solutions near the vertices of the simplex S_N.
Each "one-hot" solution corresponds to one particular assignment of SONN inputs to SONN outputs.
The most powerful intermittent search occurs close to the critical temperature at which the "one-hot" equilibria lose stability.
There the "one-hot" solutions do not exhibit a strong attractive force in the ISM state space.
SONN weight updates can easily jump from one assignment to another, occasionally being pulled in by the neutralizing force of the stable maximum entropy equilibrium w̄.
23 Detailed bifurcation structure
[Figure: bifurcation structure of ISM equilibria (N = 10) - branches of γ_1 versus temperature T for several values of N_1, with equilibrium existence curves T_E(10,1), T_E(10,2), T_E(10,3), T_E(10,4).]
24 Positions of ISM Equilibria
25 Stable/unstable manifolds
[Figure: stable and unstable manifolds of ISM equilibria on the simplex, in coordinates (w_1, w_2, w_3).]
More information4.5 Simplex method. LP in standard form: min z = c T x s.t. Ax = b
4.5 Simplex method LP in standard form: min z = c T x s.t. Ax = b x 0 George Dantzig (1914-2005) Examine a sequence of basic feasible solutions with non increasing objective function values until an optimal
More informationSolutions to Dynamical Systems 2010 exam. Each question is worth 25 marks.
Solutions to Dynamical Systems exam Each question is worth marks [Unseen] Consider the following st order differential equation: dy dt Xy yy 4 a Find and classify all the fixed points of Hence draw the
More informationDynamical Systems and Chaos Part I: Theoretical Techniques. Lecture 4: Discrete systems + Chaos. Ilya Potapov Mathematics Department, TUT Room TD325
Dynamical Systems and Chaos Part I: Theoretical Techniques Lecture 4: Discrete systems + Chaos Ilya Potapov Mathematics Department, TUT Room TD325 Discrete maps x n+1 = f(x n ) Discrete time steps. x 0
More informationTopological scaling and gap filling at crisis
PHYSICAL REVIEW E VOLUME 61, NUMBER 5 MAY 2000 Topological scaling and gap filling at crisis K. Gábor Szabó, 1,2 Ying-Cheng Lai, 3 Tamás Tél, 2 and Celso Grebogi 4 1 Department of Mathematics, University
More informationComputational Linear Algebra
Computational Linear Algebra PD Dr. rer. nat. habil. Ralf Peter Mundani Computation in Engineering / BGU Scientific Computing in Computer Science / INF Winter Term 2017/18 Part 3: Iterative Methods PD
More informationChapter 11. Matrix Algorithms and Graph Partitioning. M. E. J. Newman. June 10, M. E. J. Newman Chapter 11 June 10, / 43
Chapter 11 Matrix Algorithms and Graph Partitioning M. E. J. Newman June 10, 2016 M. E. J. Newman Chapter 11 June 10, 2016 1 / 43 Table of Contents 1 Eigenvalue and Eigenvector Eigenvector Centrality The
More information/639 Final Solutions, Part a) Equating the electrochemical potentials of H + and X on outside and inside: = RT ln H in
580.439/639 Final Solutions, 2014 Question 1 Part a) Equating the electrochemical potentials of H + and X on outside and inside: RT ln H out + zf 0 + RT ln X out = RT ln H in F 60 + RT ln X in 60 mv =
More informationAPPLIED SYMBOLIC DYNAMICS AND CHAOS
DIRECTIONS IN CHAOS VOL. 7 APPLIED SYMBOLIC DYNAMICS AND CHAOS Bai-Lin Hao Wei-Mou Zheng The Institute of Theoretical Physics Academia Sinica, China Vfö World Scientific wl Singapore Sinaaoore NewJersev
More informationABC-LogitBoost for Multi-Class Classification
Ping Li, Cornell University ABC-Boost BTRY 6520 Fall 2012 1 ABC-LogitBoost for Multi-Class Classification Ping Li Department of Statistical Science Cornell University 2 4 6 8 10 12 14 16 2 4 6 8 10 12
More informationManifold Learning: Theory and Applications to HRI
Manifold Learning: Theory and Applications to HRI Seungjin Choi Department of Computer Science Pohang University of Science and Technology, Korea seungjin@postech.ac.kr August 19, 2008 1 / 46 Greek Philosopher
More informationSynchronization Transitions in Complex Networks
Synchronization Transitions in Complex Networks Y. Moreno 1,2,3 1 Institute for Biocomputation and Physics of Complex Systems (BIFI) University of Zaragoza, Zaragoza 50018, Spain 2 Department of Theoretical
More information