
1 ANNIE 98 PRESENTATION

Gursel Serpen and David L. Livingston
Electrical Engineering and Computer Science Department, The University of Toledo, Toledo, OH 43606

2 THE RESEARCH WORK

A higher-order, single-layer, relaxation-type recurrent neural network is employed to plan a task in a decomposed state space. The finite state machine model of the state space of the complex task is decomposed into parallel/series combinations of component machines, each of which represents a subtask, using lattice-theoretic techniques. Planning the complex task is realized by identifying the transfer sequence, an ordered set of inputs, of the component state machines for the subtasks. A fourth-order stochastic optimization neural network, a single-layer relaxation-type recurrent network with simulated annealing, is utilized to perform the search needed to specify the transfer sequence in the state space of the complex task. The effect of decomposition on two essential areas is highlighted: the significant reduction in the computational resources required to implement the neural algorithm, and the need to employ a high-order neural network.

3 TOWERS OF HANOI PROBLEM

The problem is to move a stack of disks from one peg to another by moving one disk at a time and never stacking a disk on top of a smaller disk.

[Figure: state space for the two-disk Towers of Hanoi problem, with nine states labeled State 0 through State 8.]
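As a concrete aside (not part of the original slides), the nine states and their legal transitions can be enumerated in a few lines of Python; the (small-disk peg, large-disk peg) encoding is my own choice.

```python
# A minimal sketch: enumerate the state space of the two-disk Towers of
# Hanoi puzzle. Each state assigns a peg (0, 1, or 2) to the small and
# the large disk, giving 3^2 = 9 states; a move is legal if the moved
# disk would not land on a smaller one.
from itertools import product

def legal_moves(state):
    """Yield successor states reachable by one legal disk move.

    `state` is a tuple (peg_of_small_disk, peg_of_large_disk)."""
    small, large = state
    for dest in range(3):
        # The small disk can always move to another peg.
        if dest != small:
            yield (dest, large)
        # The large disk may move only if the small disk is neither on
        # its source peg (it must be on top) nor on the destination peg.
        if dest != large and small != large and small != dest:
            yield (small, dest)

states = list(product(range(3), repeat=2))   # the 9 states of the puzzle
assert len(states) == 9
for s in states:
    print(s, "->", sorted(legal_moves(s)))
```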

4 [Diagram: STATE SPACE → STATE MACHINE → LATTICE OF SUBSTITUTION PROPERTY PARTITIONS → PARALLEL OR SERIALLY DECOMPOSED STATE MACHINE]

5 SERIAL DECOMPOSITION

Using the substitution property partition $\pi$ for the head component machine,

$$\pi = \{\, S_0, S_3, S_6 \;;\; S_1, S_4, S_7 \;;\; S_2, S_5, S_8 \,\}$$

and generating a suitable partition $\tau$ for the tail component machine, subject to the constraint $\pi \cdot \tau = 0$, where

$$\tau = \{\, S_0, S_1, S_2 \;;\; S_3, S_4, S_5 \;;\; S_6, S_7, S_8 \,\}$$

a serial decomposition of the task can be obtained. The next step is to assign states, which can be done in an arbitrary fashion, to the blocks of the partitions to facilitate the derivation of state tables for the two component machines:

blocks of $\pi$: $B_1 = \{S_0, S_3, S_6\}$, $B_2 = \{S_1, S_4, S_7\}$, $B_3 = \{S_2, S_5, S_8\}$
blocks of $\tau$: $B_4 = \{S_0, S_1, S_2\}$, $B_5 = \{S_3, S_4, S_5\}$, $B_6 = \{S_6, S_7, S_8\}$

[Figure: two-component serial decomposition topology; the input I drives machine $\pi$ (head), whose state output feeds machine $\tau$ (tail).]
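To make the $\pi \cdot \tau = 0$ condition concrete, here is a small Python check, my own sketch rather than anything from the slides: it encodes the two partitions, labels each state by its (π-block, τ-block) pair, and verifies that the pair identifies every state uniquely, which is what makes the serial decomposition lossless.

```python
# The two substitution-property partitions from the slide.
PI  = [{0, 3, 6}, {1, 4, 7}, {2, 5, 8}]   # blocks B1, B2, B3 (head machine)
TAU = [{0, 1, 2}, {3, 4, 5}, {6, 7, 8}]   # blocks B4, B5, B6 (tail machine)

def block_of(partition, state):
    """Index of the block of `partition` containing `state`."""
    return next(i for i, b in enumerate(partition) if state in b)

# pi . tau = 0: every (pi-block, tau-block) pair pins down at most one
# state, so the (head state, tail state) pair recovers the original state.
codes = {(block_of(PI, s), block_of(TAU, s)) for s in range(9)}
assert len(codes) == 9   # all 9 states get distinct codes

for s in range(9):
    print(f"state {s} -> head B{block_of(PI, s) + 1}, tail B{block_of(TAU, s) + 4}")
```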

6 N-TH ORDER RELAXATION-TYPE BOLTZMANN MACHINE

The n-th order Boltzmann machine is a stochastic single-layer recurrent network and will be employed in relaxation mode to search for the shortest path. The performance function of an n-th order Boltzmann machine is defined as

$$E = \frac{1}{N} \sum_{i_1 i_2 \cdots i_N} w_{i_1 i_2 \cdots i_N}\, s_{i_1} s_{i_2} \cdots s_{i_N} + \sum_{i_k} b_{i_k} s_{i_k}$$

where $w_{i_1 i_2 \cdots i_N}$ is an N-dimensional weight matrix symmetric on all pairs of indices, $b_{i_k}$ is the external input to, and $s_{i_k}$ the output of, the computation node $i_k$ with $k = 1, \ldots, N$. Each computation node output is binary valued, zero or one, and the activation function is defined as

$$P(s_{i_k} = 1) = \frac{1}{1 + e^{-\mathrm{net}_{i_k}/T}}$$

where $P_{i_k}$ is the probability that $s_{i_k}$, the output of unit $i_k$, is equal to 1, $T$ is a time-varying computational parameter analogous to temperature, and $\mathrm{net}_{i_k}$ is the input sum to unit $i_k$. The term $\mathrm{net}_{i_k}$ is defined by

$$\mathrm{net}_{i_k} = \sum_{j_1 j_2 \cdots j_{N-1}} w_{i_k j_1 j_2 \cdots j_{N-1}}\, s_{j_1} s_{j_2} \cdots s_{j_{N-1}}, \qquad i_k \notin \{ j_1, j_2, \ldots, j_{N-1} \}$$
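As a concrete illustration of this update rule, here is a minimal Python sketch, assuming a dictionary of symmetric weight tuples (stored once per index combination) and a geometric cooling schedule; none of the names (relax, anneal, T0, alpha) come from the presentation, and the external inputs $b_{i_k}$ are omitted exactly as in the net definition above.

```python
# A minimal sketch of n-th order relaxation with simulated annealing.
import math
import random

def relax(weights, s, T):
    """One asynchronous sweep: update every node once, in random order.

    weights: dict mapping an N-tuple of node indices to a weight;
    s:       list of binary node outputs (0 or 1);
    T:       current temperature."""
    for i in random.sample(range(len(s)), len(s)):
        # net_i: over every weight tuple containing node i, accumulate
        # the weight times the product of the other nodes' outputs.
        net = sum(w * math.prod(s[j] for j in idx if j != i)
                  for idx, w in weights.items() if i in idx)
        x = max(min(net / T, 700.0), -700.0)   # guard math.exp overflow
        p = 1.0 / (1.0 + math.exp(-x))         # P(s_i = 1)
        s[i] = 1 if random.random() < p else 0
    return s

def anneal(weights, s, T0=10.0, alpha=0.95, sweeps=200):
    """Relax while cooling the temperature geometrically."""
    T = T0
    for _ in range(sweeps):
        s = relax(weights, s, T)
        T *= alpha
    return s

# Hypothetical second-order example: one mutually supporting pair of
# nodes, so the network should settle with both nodes on.
print(anneal({(0, 1): 2.0}, [0, 0]))
```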

7 FOURTH-ORDER RELAXATION-TYPE BOLTZMANN MACHINE TOPOLOGY

For a two-component decomposition, the $M \times N$ topology can be partitioned into two arrays with dimensions $M_1 \times N$ and $M_2 \times N$, where $M_1$ is the number of states in the first component machine and $M_2$ is the number of states in the second component machine, with $M_1 + M_2 < M$. For the two-disk Towers of Hanoi decomposition, $M = 9$ and $M_1 = M_2 = 3$, so the array shrinks from $9 \times N$ to $6 \times N$ computation nodes.

[Figure: neural network array for a solution of the Towers of Hanoi problem; the HEAD rows correspond to blocks B1, B2, B3 and the TAIL rows to blocks B4, B5, B6.]

8 DERIVATION OF WEIGHT PARAMETER VALUES

A serially decomposed state machine will require a fourth-order neural network architecture. The general form of the performance function for a fourth-order Boltzmann machine is

$$E = \frac{1}{2} \sum_{i} \sum_{j} \sum_{k} \sum_{l} w_{ijkl}\, s_i s_j s_k s_l + \sum_{i} b_i s_i$$

where $w_{ijkl}$ is the weight between computation nodes $s_i$, $s_j$, $s_k$, and $s_l$. The weight term is defined by

$$w_{ijkl} = \sum_{\alpha} g_{\alpha}\, \delta^{\alpha}_{ijkl}$$

where $\alpha$ is the index over the set of constraints, $g_{\alpha} \in R^{+}$ if the hypotheses the nodes represent for constraint $\alpha$ are mutually supporting, and $g_{\alpha} \in R^{-}$ if the hypotheses are conflicting. The term $\delta^{\alpha}_{ijkl}$ is equal to 1 if the hypotheses represented by $s_i$, $s_j$, $s_k$, and $s_l$ are compatible under constraint $\alpha$, and is equal to 0 otherwise. Bounds on the constraint weight parameters can be defined to establish the complete set of solutions as stable equilibrium points in the state space of the neural network dynamics, as follows:

$$g_1 + 2g_3 \geq g_2 \qquad \text{and} \qquad 2g_1 + g_3 \geq g_4$$
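The rule $w_{ijkl} = \sum_{\alpha} g_{\alpha} \delta^{\alpha}_{ijkl}$ lends itself to a direct construction. The following Python sketch is illustrative only; the constraint predicates and the $g_{\alpha}$ values are hypothetical stand-ins, not the constraints used in the presentation.

```python
# Build a fourth-order weight table from constraint contributions:
# each constraint alpha adds g_alpha (positive if mutually supporting,
# negative if conflicting) to w_ijkl whenever its compatibility
# predicate delta_alpha holds for the node quadruple (i, j, k, l).
from itertools import combinations

def build_weights(n_nodes, constraints):
    """constraints: list of (g_alpha, delta_alpha) pairs, where
    delta_alpha is a predicate on four node indices."""
    weights = {}
    for quad in combinations(range(n_nodes), 4):
        w = sum(g for g, delta in constraints if delta(*quad))
        if w != 0.0:
            weights[quad] = w
    return weights

# Hypothetical example on a 3 x 4 node array: reward quadruples lying in
# one row (mutually supporting), punish quadruples sharing a column
# (conflicting).
N_ROWS, N_COLS = 3, 4
row = lambda i: i // N_COLS
col = lambda i: i % N_COLS
constraints = [
    (+1.0, lambda i, j, k, l: len({row(i), row(j), row(k), row(l)}) == 1),
    (-2.0, lambda i, j, k, l: len({col(i), col(j), col(k), col(l)}) < 4),
]
w = build_weights(N_ROWS * N_COLS, constraints)
print(len(w), "nonzero fourth-order weights")
```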

9 CONCLUSIONS

We have demonstrated the use of a high-order Boltzmann machine for the task planning problem in decomposed state spaces. By using the decomposed form of the state machines that model the tasks, the number of processing elements in the Boltzmann machine can be reduced at the expense of high-order connections. We have demonstrated the need for high-order machines in a class of problems that may be of practical significance to task planning and execution. In the case of task planning in decomposed state spaces, it was observed that a high-order neural network was necessary to implement the compatibility relations between subtasks. It is still an open research question whether the proposed task planning neural network will scale well to large state spaces. We have employed a procedure to define the values of the constraint weight parameters so as to establish the stability of problem solutions in the state space of the neural network dynamics. Preliminary simulation results indicated that the high-order Boltzmann machine converged to a solution in the majority of relaxations. Further research is needed to firmly establish the performance of the high-order Boltzmann machine.
