IST 4 Information and Logic


[Course calendar: homework (hw#x) out/due dates, office hours (oh), midterm MQs out/due, and student presentation dates; T = today.]

MQs: 1. Everyone has a gift! (Tuesday) 2. Memory (Thursday)

Tuesday, 6/2, 2:30pm:
1. Christopher Haack: The gift of resilience
2. Joon Lee: Settling is not an option
3. Spencer Strumwasser: The gift of dyslexia
4. Richard Zhu: The gift of memory
5. Ashwin Hari: The gift of musical composition
6. Jessica Nassimi: Evolution, a gift in disguise
7. Serena Delgadillo: The gift of self-expression
8. Megan Keehan: Gift of motherliness
9. Zane Murphy: Grandmother and the piano

Thursday, 6/4, 2:30pm:
1. Connor Lee: Memory is a fickle thing, blessing or curse
2. Pallavi Aggarwal: The wonders of human memory
3. Peter Kundzicz and Anshul Ramachandran: Muscle memories
4. Siva Gangavarapu: A cultural retrospection
5. Philip Liu: The light of other days
6. Jason Simon: Math and Broadway
7. Yujie Xu: Memory vs. ESL
8. Celia Zhang: When memory sours

Last Lecture: Gates and circuits. AON: AND, OR, NOT. LT: Linear Threshold. [Gate diagrams: AON and LT circuits on inputs a, b, c.]

Last Lecture: AON: AND, OR, NOT; LT: Linear Threshold. General construction for symmetric functions: AON depth 5, layered LT (LT-l) depth 4, non-layered LT (LT-nl) depth 2* (* = optimal). Exponential gap in size. Q: What are the symmetric functions that can be computed by a single LT gate?

Linear Threshold and SYM

LT: Linear Threshold

Symmetric Functions and LT Circuits. Q: Which class has more functions? Q: How is SYM related to LT? Definitions: (1) SYM = the class of Boolean symmetric functions. (2) LT = the class of Boolean functions that can be realized by a single LT gate.

AND, OR, XOR and MAJ are symmetric functions. Q: Which symmetric functions are in LT?

|X|   AND   OR    XOR   MAJ
 0     0     0     0     0
 1     0     1     1     0
 2     0     1     0     1
 3     1     1     1     1
      LT    LT   not LT  LT

LT = the class of Boolean functions that can be realized by a single LT gate.

Definition: A symmetric Boolean function is in TH if it has at most a single transition in the symmetric function table.

|X|   AND   OR    XOR   MAJ
 0     0     0     0     0
 1     0     1     1     0
 2     0     1     0     1
 3     1     1     1     1

AND, OR, and MAJ are in TH (a single transition each); XOR is not in TH.
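The single-transition test above is easy to check by machine. The sketch below (the helper `in_TH` is ours, not from the slides) counts transitions in a symmetric function table indexed by |X|:

```python
# Sketch of the TH membership test: a symmetric Boolean function, given by
# its table indexed by |X|, is in TH iff the table has at most one
# transition (0 -> 1 or 1 -> 0).
def in_TH(table):
    transitions = sum(1 for a, b in zip(table, table[1:]) if a != b)
    return transitions <= 1

# Symmetric-function tables for 3 variables, indexed by |X| = 0, 1, 2, 3:
AND = [0, 0, 0, 1]
OR  = [0, 1, 1, 1]
XOR = [0, 1, 0, 1]
MAJ = [0, 0, 1, 1]

for name, table in [("AND", AND), ("OR", OR), ("XOR", XOR), ("MAJ", MAJ)]:
    print(name, "in TH" if in_TH(table) else "not in TH")
```

XOR is the only one of the four with more than one transition, so it is the only one not in TH.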

The Class TH is in LT.

|X|   TH0   TH1   TH2   TH3
 0     1     0     0     0
 1     1     1     0     0
 2     1     1     1     0
 3     1     1     1     1

TH_k(X) = 1 iff |X| >= k; each TH_k is realized by a single LT gate with all weights 1 and threshold k.

Q: How is TH related to SYM and LT? We know that TH ⊆ SYM; we proved that TH ⊆ LT.

TH is exactly the intersection of SYM and LT. Theorem: TH = SYM ∩ LT. Proof: Not today... you might want to try and prove it... Q: What are the 4 functions? [Venn diagram of SYM, TH, LT.]

LT Function that is not Symmetric. [Example: truth table of an LT gate that computes a non-symmetric function.]

Linear Threshold Circuits for symmetric functions

General construction for symmetric functions: AON depth 5, layered LT (LT-l) depth 4, non-layered LT (LT-nl) depth 2.

Q: compute XOR with TH gates?

|X|   XOR   TH1   TH2   TH1-TH2
 0     0     0     0      0
 1     1     1     0      1
 2     0     1     1      0

LT Depth-2 Circuits: an output gate combines TH1 (weight +1) and TH2 (weight -1).

|X|   XOR   TH1   TH2   TH1-TH2
 0     0     0     0      0
 1     1     1     0      1
 2     0     1     1      0
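The identity XOR = TH1 - TH2 can be checked by brute force. A minimal sketch (the helper `TH` implementing the threshold functions is ours):

```python
# Brute-force check of the depth-2 identity XOR = TH1 - TH2 on two variables.
from itertools import product

def TH(k, x):
    """TH_k(x) = 1 iff at least k of the inputs are 1."""
    return 1 if sum(x) >= k else 0

for x in product((0, 1), repeat=2):
    assert TH(1, x) - TH(2, x) == x[0] ^ x[1]
print("XOR = TH1 - TH2 verified on all four inputs")
```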

Generalization: a symmetric function f with a single 1-interval.

|X|   f(x)   TH1   TH3   TH1-TH3
 0     0      0     0      0
 1     1      1     0      1
 2     1      1     0      1
 3     0      1     1      0
 4     0      1     1      0

f = TH1 - TH3: add TH1 where the 1-interval starts and subtract TH3 where it ends.

Generalization to EQ: EQ_k(x) = 1 iff |X| = k. A single 1-entry in the table gives EQ_k = TH_k - TH_{k+1}.

Generalization to SYM. [Depth-2 circuit with +/- weighted TH gates.] Q: What is the generalization to arbitrary symmetric functions?

Generalization to SYM. Q: What is the generalization to arbitrary symmetric functions? A: Consider the symmetric function table: it is a union of non-overlapping 1-intervals, and each 1-interval is a signed sum of two TH functions.

Back to XOR: n TH gates for XOR of n variables. XOR(x) = 1 iff |X| is odd, so the table alternates 0, 1, 0, 1, ... and XOR = TH1 - TH2 + TH3 - TH4 + ...

LT-l Circuit Design Algorithm for SYM: scan the symmetric function table of f(x); add TH_a for every |X| = a where a 1-block starts, and subtract TH_{b+1} for every 1-block that ends at |X| = b. [Example table over |X| = 0..7.]
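The design algorithm can be sketched in a few lines. This is a sketch under the interpretation above (the helper names `TH` and `th_decomposition` are ours, not from the slides), verified against XOR of 4 variables:

```python
# LT-l design algorithm for a symmetric function f, given as a table
# f_table indexed by |X|: emit +TH_a at the start of every 1-block and
# -TH_{b+1} right after the block ends.
from itertools import product

def TH(k, x):
    """TH_k(x) = 1 iff at least k of the inputs are 1."""
    return 1 if sum(x) >= k else 0

def th_decomposition(f_table):
    """Return signed thresholds [(+1, a), (-1, b+1), ...] such that
    f(x) = sum of sign * TH_k(x)."""
    terms, prev = [], 0
    for w, val in enumerate(f_table):
        if val == 1 and prev == 0:
            terms.append((+1, w))   # a 1-block starts at |X| = w
        if val == 0 and prev == 1:
            terms.append((-1, w))   # a 1-block ended just before w
        prev = val
    return terms

# Check on XOR of n = 4 variables: f = 1 iff |X| is odd.
n = 4
f_table = [w % 2 for w in range(n + 1)]
terms = th_decomposition(f_table)
for x in product((0, 1), repeat=n):
    assert sum(s * TH(k, x) for s, k in terms) == f_table[sum(x)]
print("decomposition:", terms)  # -> [(1, 1), (-1, 2), (1, 3), (-1, 4)]
```

As the slides state, XOR of n variables uses n TH gates: one per alternation in its table.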

The Layered Construction for SYM: Some History. Saburo Muroga (1925-2009) was born in Japan. 1959: Majority Decision. PhD in 1958 from Tokyo University, Japan. 1960-1964: researcher at IBM Research, NY. 1964-2002: professor at the University of Illinois, Urbana-Champaign.

Saburo Muroga (1925-2009). HW#5 problem 2a.

neural circuits and logic: some more history...

Being Homeless and Interdisciplinary Research. Warren McCulloch (1899-1969), neurophysiologist, MD. Walter Pitts (1923-1969), logician, autodidact. Warren McCulloch arrived in early 1942 at the University of Chicago and invited Pitts, who was homeless, to live with his family. In the evenings McCulloch and Pitts collaborated. Pitts was familiar with the work of Leibniz on computing. They considered the question of whether the nervous system is a kind of universal computing device as described by Leibniz. This led to their seminal 1943 neural networks paper: A Logical Calculus of the Ideas Immanent in Nervous Activity.

Impact. Warren McCulloch (1899-1969), neurophysiologist, MD. Walter Pitts (1923-1969), logician, autodidact. Their seminal 1943 neural networks paper, A Logical Calculus of the Ideas Immanent in Nervous Activity, connected: neural networks and logic; time; memory; threshold logic and learning; state machines.

neural circuits and memory: computing with dynamics

Linear Threshold: Some Adjustments. Linear Threshold (LT) gate with threshold t. [Gate diagram: the threshold t drawn as a weight -t.]

AND Function with {0,1}: AND(x1, x2) = 1 iff x1 + x2 >= 2 (weights 1, 1; threshold 2). [Truth table.]

AND Function with {-1,1}: the AND function of two variables over {-1,1} is AND(x1, x2) = sign(x1 + x2 - 1):

x1    x2    x1+x2-1   AND
-1    -1      -3       -1
-1     1      -1       -1
 1    -1      -1       -1
 1     1       1        1
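The {-1,1} realization can be checked on all four inputs. A minimal sketch (with the convention sign(0) = 1, which never arises here since x1 + x2 - 1 is always odd):

```python
# Check that AND over {-1,1} is realized by the LT gate sign(x1 + x2 - 1).
def sign(v):
    return 1 if v >= 0 else -1

for x1 in (-1, 1):
    for x2 in (-1, 1):
        expected = 1 if (x1 == 1 and x2 == 1) else -1
        assert sign(x1 + x2 - 1) == expected
print("AND over {-1,1} = sign(x1 + x2 - 1) on all four inputs")
```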

Linear Threshold with Memory. Elephants are symbols of wisdom in Asian cultures and are famed for their exceptional memory. A memory 'nose': the gate remembers the last f(x).

Feedback Networks Example

[Two-node network diagram with its weights and thresholds.] The state of the network: the vector that corresponds to the states (the 'noses') of the gates.

[Animation across several slides: label the gates 1 and 2 and update them one node at a time from different initial states. Two of the four states in {-1,1}^2 are verified to be stable; for a third state, the answer to "Q: is it a stable state?" is no. The resulting state transition diagram (state space) over the four states shows which states lead to which stable state.]

neural circuits and memory: associative memory

Feedback Networks: Computing with Dynamics. Input: initial state → Feedback Network → Output: stable state. [State space with the stable states marked.]

Feedback Networks: Computing with Dynamics = Associative Memory, "The Leibniz-Boole Machine". Input: initial state → Feedback Network → Output: stable state. [Animation: several different initial states each converge to a stored stable state.]

Who is this person?????

John Hopfield. Feedback Networks: the Hopfield Model (Caltech, 1982).

John Hopfield. Feedback Networks: the Hopfield Model (Caltech, 1982). [Network diagram.] Notation: i = node i; t_i = threshold of node i; v_i = state of node i; w_ij = weight of edge (i, j).

The matrix description

Feedback Networks: The Vector/Matrix Description. An n-node feedback network can be specified by: W, an n x n matrix of weights; T, an n-vector of thresholds; V, an n-vector of states.

The Matrix Description: Example. [A two-node network with its weight matrix W, threshold vector T, and state vector V.]

The Matrix Description: Computation. Computation in N = (W, T): node i computes v_i = sign(Σ_j w_ij v_j - t_i), reading the weights of node i off W.

Order of computation: serial and parallel

Modes of Operation. Q: when do the nodes compute? Serial mode: one node at a time (in arbitrary order). Fully-Parallel mode: all nodes at the same time.
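The two modes can be sketched on a two-node network (the weights and thresholds here are illustrative, not necessarily the exact slide example; sign(0) = 1 by convention):

```python
# Serial vs fully-parallel update of a two-node feedback network.
W = [[0, 1],
     [1, 0]]   # symmetric weight matrix, zero diagonal
T = [0, 0]     # thresholds

def sign(v):
    return 1 if v >= 0 else -1

def update_node(V, i):
    return sign(sum(W[i][j] * V[j] for j in range(len(V))) - T[i])

def serial_step(V, i):
    """Serial mode: recompute a single node i, keeping the rest fixed."""
    V = list(V)
    V[i] = update_node(V, i)
    return V

def parallel_step(V):
    """Fully-parallel mode: recompute all nodes from the old state."""
    return [update_node(V, i) for i in range(len(V))]

print("serial:  ", serial_step([1, -1], 0))   # -> [-1, -1], a stable state
print("parallel:", parallel_step([1, -1]))    # -> [-1, 1]; applying it again
                                              #    gives [1, -1]: a 2-cycle
```

With this symmetric W, a serial update lands in a stable state, while fully-parallel updates from [1, -1] oscillate with period 2, matching the behavior of Examples 1 and 2.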

Three examples

Example 1: Serial Mode, Symmetric Weight Matrix. [Two-node network.] The state space: two stable states, and every other state leads to one of them.

Example 2: Fully-Parallel (FP) Mode, Symmetric Weight Matrix. Q: how does the state space look? Start from an initial state — it's a cycle!

Example 2: Fully-Parallel (FP) Mode, Symmetric Weight Matrix. The state space: stable states and a cycle of length 2.

Example 3: Fully-Parallel Mode, Antisymmetric Weight Matrix: W^T = -W. Q: how does the state space look?

Example 3: Fully-Parallel Mode, Antisymmetric Weight Matrix. The state space: a cycle of length 4 through the four states.
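The length-4 cycle can be reproduced with a small antisymmetric W (illustrative weights, not necessarily the slide's exact example):

```python
# Fully-parallel dynamics with an antisymmetric weight matrix (W^T = -W):
# starting from [1, 1], the orbit visits all four states and returns.
W = [[0, 1],
     [-1, 0]]  # antisymmetric: W[j][i] == -W[i][j]
T = [0, 0]

def sign(v):
    return 1 if v >= 0 else -1

def parallel_step(V):
    return [sign(sum(W[i][j] * V[j] for j in range(2)) - T[i]) for i in range(2)]

V = [1, 1]
orbit = [tuple(V)]
for _ in range(4):
    V = parallel_step(V)
    orbit.append(tuple(V))
print(orbit)  # -> [(1, 1), (1, -1), (-1, -1), (-1, 1), (1, 1)]: a 4-cycle
```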

The Three Cases. Cycle lengths:

mode            | W symmetric | W antisymmetric
serial          | 1           | ?
fully-parallel  | 1, 2        | 4

(Example 1: serial/symmetric; Example 2: fully-parallel/symmetric; Example 3: fully-parallel/antisymmetric.)

The Three Cases. Cycle lengths:

mode            | W symmetric         | W antisymmetric
serial          | 1 (Hopfield 1982)   | ?
fully-parallel  | 1, 2 (Goles 1985)   | 4 (Goles 1986)

Proof Ideas. The proofs of these three results use the concept of an energy function. For the serial mode: show that the energy E never decreases under a single-node update; namely, stable states are local maxima of the energy E.

Questions on Convergence. Posted on the class web site. Q1: Are the three cases distinct? Q2: Is there an elementary proof (without the energy function)?