
1 Simple Connectionist Systems

Steffen Hölldobler
International Center for Computational Logic
Technische Universität Dresden, Germany

Connectionist Networks
Connections
Activation Functions
Output Functions
Units
Updates
Winner-Take-All Networks

2 Connectionist Networks

A connectionist network consists of a finite set $U$ of units and a finite set of connections $W \subseteq U \times U$. Each connection is labelled by a weight (a mapping $W \to \mathbb{R}$).

A unit consists of
- an input (vector) $(i_1, \ldots, i_m)$ with $i_j \in \mathbb{R}$ for all $1 \leq j \leq m$,
- an activation function $\varphi : \mathbb{R}^m \to \mathbb{R}$,
- a potential $p = \varphi(i_1, \ldots, i_m)$,
- an output function $\psi : \mathbb{R} \to \mathbb{R}$, and
- an (output) value $v = \psi(p)$.

Sometimes a linear time $t \in \mathbb{N}$ is added.

Let $U = \{u_i \mid 1 \leq i \leq n\}$ be a connectionist network. The state of $U$ at time $t$ is $(v_1(t), \ldots, v_n(t))$, where $v_i(t)$ is the output value of unit $u_i$ at time $t$ for all $1 \leq i \leq n$.
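These definitions translate directly into a small data structure. The following is a minimal sketch (class and function names are ours, not the lecture's): a unit bundles its activation function, output function, potential, and value, and the state of a network is the tuple of current values.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Unit:
    phi: Callable[[List[float]], float]  # activation function phi : R^m -> R
    psi: Callable[[float], float]        # output function psi : R -> R
    potential: float = 0.0               # p = phi(i_1, ..., i_m)
    value: float = 0.0                   # v = psi(p)

    def update(self, inputs: List[float]) -> float:
        """Compute potential and value from the current inputs."""
        self.potential = self.phi(inputs)
        self.value = self.psi(self.potential)
        return self.value

def state(units: List[Unit]) -> Tuple[float, ...]:
    """State of the network at time t: the tuple (v_1(t), ..., v_n(t))."""
    return tuple(u.value for u in units)
```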

3 Example

[Figure: a unit $u_k$ receives inputs from units $u_1, \ldots, u_m$ over weighted connections $w_{k1}, \ldots, w_{km}$. Each incoming value $v_j$ generates the input $i_{kj} = w_{kj} v_j$; the unit computes the potential $p_k = \varphi_k(i_{k1}, \ldots, i_{km})$ and the value $v_k = \psi_k(p_k)$.]

4 Connections

Directed and weighted connection from $u_j$ to $u_k$ with weight $w_{kj} \in \mathbb{R}$, generating the input $i_k = w_{kj} v_j$.

Higher-order connection from units $u_{j_1}, \ldots, u_{j_m}$ to unit $u_k$ with weight $w_{kj_1 \ldots j_m} \in \mathbb{R}$ (wrt some ordering $j_1 < \ldots < j_m$), generating the input $i_k = w_{kj_1 \ldots j_m} \prod_{l=1}^{m} v_{j_l}$.

An undirected and weighted connection between $u_j$ and $u_k$ consists of a directed and weighted connection from $u_j$ to $u_k$ and a directed and weighted connection from $u_k$ to $u_j$, where $w_{kj} = w_{jk}$.
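As a quick illustration of the inputs generated by the two directed connection types, a sketch with our own helper names:

```python
import math

def directed_input(w_kj, v_j):
    """Directed weighted connection: i_k = w_kj * v_j."""
    return w_kj * v_j

def higher_order_input(w, values):
    """Higher-order connection: i_k = w * (v_{j_1} * ... * v_{j_m})."""
    return w * math.prod(values)

print(directed_input(0.5, 2.0))                  # 1.0
print(higher_order_input(2.0, [1.0, 0.5, 3.0]))  # 3.0
```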

5 Activation Functions

Let $u_k \in U$ be a unit with inputs $(i_1, \ldots, i_m)$.

Weighted sum activation function
$$\varphi(i_1, \ldots, i_m, t+1) = \sum_{j=1}^{m} i_j(t), \quad \text{where } i_j = w_{kj} v_j(t).$$

Sigma-pi activation function
$$\varphi(i_1, \ldots, i_m, t+1) = \sum_{j=1}^{m} i_j(t), \quad \text{where } i_j = w_{kj_1 \ldots j_{m_j}} \prod_{l=1}^{m_j} v_{j_l}(t).$$
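Both activation functions can be sketched as follows (assuming, as on the previous slide, that each input of the weighted-sum case already carries its weight; for the sigma-pi case the weights and the presynaptic value groups are passed explicitly):

```python
import math

def weighted_sum_activation(inputs):
    """phi(i_1, ..., i_m) = sum_j i_j, where i_j = w_kj * v_j(t)."""
    return sum(inputs)

def sigma_pi_activation(weights, value_groups):
    """phi = sum_j w_j * prod_l v_{j_l}(t): one weighted product per group."""
    return sum(w * math.prod(vs) for w, vs in zip(weights, value_groups))
```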

6 Some Output Functions

Let $u_k \in U$ be a unit with potential $p_k$. Let $\theta_k \in \mathbb{R}$; $\theta_k$ is called the threshold.

Binary threshold output function
$$\psi(p_k) = \begin{cases} 1 & \text{if } p_k \geq \theta_k \\ 0 & \text{otherwise.} \end{cases}$$

Binary bipolar threshold output function
$$\psi(p_k) = \begin{cases} 1 & \text{if } p_k \geq \theta_k \\ -1 & \text{otherwise.} \end{cases}$$

Linear output function: $\psi(p_k)$ is linear.
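A sketch of the two threshold output functions, with the threshold passed as a parameter:

```python
def binary_threshold(p, theta):
    """1 if p >= theta, 0 otherwise."""
    return 1.0 if p >= theta else 0.0

def bipolar_threshold(p, theta):
    """1 if p >= theta, -1 otherwise."""
    return 1.0 if p >= theta else -1.0
```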

7 More Output Functions

Let $u_k \in U$ be a unit with potential $p_k$. Let $\theta_k \in \mathbb{R}$ be a threshold and $\beta > 0$ a steepness parameter.

A function is a squashing function if it is non-constant, bounded, monotone increasing, and continuous.

Examples

Sigmoidal output function
$$\psi(p_k) = \frac{1}{1 + e^{-\beta (p_k - \theta_k)}}.$$

Bipolar sigmoidal or hyperbolic tangent output function
$$\psi(p_k) = \frac{2}{1 + e^{-2 p_k}} - 1 = \tanh(p_k).$$
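Assuming the standard logistic form of the sigmoidal function reconstructed above, the two squashing output functions read:

```python
import math

def sigmoidal(p, theta, beta):
    """Logistic squashing function: 1 / (1 + e^{-beta * (p - theta)})."""
    return 1.0 / (1.0 + math.exp(-beta * (p - theta)))

def bipolar_sigmoidal(p):
    """2 / (1 + e^{-2p}) - 1, which equals tanh(p)."""
    return math.tanh(p)
```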

8 Threshold Units

Binary threshold unit: weighted sum activation function, binary threshold output function.

Bipolar binary threshold unit: weighted sum activation function, bipolar binary threshold output function.

Sigma-pi unit: sigma-pi activation function, binary threshold output function.

A binary threshold unit is said to be active if its output is 1, and passive if its output is 0. A binary threshold unit flips if its output changes from 1 to 0 or from 0 to 1 between two consecutive time points. Active and passive states as well as flips are defined likewise for bipolar binary threshold units.
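Putting the pieces together, a binary threshold unit is just the Unit of slide 2 with weighted-sum activation and binary threshold output (using the sketches above; the numbers are arbitrary):

```python
theta = 0.5
u = Unit(phi=weighted_sum_activation,
         psi=lambda p: binary_threshold(p, theta))
print(u.update([0.3, 0.4]))  # potential 0.7 >= 0.5, so the unit is active: 1.0
```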

9 Squashing Units

Sigmoidal unit: weighted sum activation function, sigmoidal output function.

Bipolar sigmoidal unit: weighted sum activation function, bipolar sigmoidal output function.

10 Linear Units

Linear unit: weighted sum activation function, linear output function.

11 Input and Output Units

Connectionist networks may be embedded in an environment. In this case, some of their units may receive additional input from the environment; these units are called input units. Inputs from the environment are real numbers. They are simply added to the input vector of input units, i.e., the input vector of an input unit consists of the inputs received from other units and the inputs received from the environment.

If a connectionist network is embedded in an environment, then the external activation must be specified for each input unit.

Likewise, output units are defined as units which send their output to the environment.

12 Updates

When a unit is updated, its potential and value are computed from the current inputs. In networks with linear time, an update yields the potential and value at time $t+1$ from the inputs at time $t$.

We will consider networks where either all units are updated synchronously or units are updated asynchronously.
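A synchronous update can be sketched as follows (helper names are ours): every unit computes its new value from the values at time $t$, and only then do the new values replace the old ones.

```python
def synchronous_step(units, weights, values):
    """One synchronous update: weights[k][j] = w_kj, values[j] = v_j(t)."""
    new_values = []
    for k, unit in enumerate(units):
        # all inputs are taken from the time-t values, never from new_values
        inputs = [weights[k][j] * values[j] for j in range(len(values))]
        new_values.append(unit.update(inputs))
    return new_values  # (v_1(t+1), ..., v_n(t+1))
```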

13 Example

Consider the following connectionist network, where

$$i_1(t) = \begin{cases} 6 & \text{if } t = 0 \\ 2 & \text{otherwise} \end{cases} \qquad i_2(t) = \begin{cases} 5 & \text{if } t = 0 \\ 2 & \text{otherwise} \end{cases}$$

$$v_j(t) = \mathrm{round}(p_j(t)) \quad (j = 1, 2, 3, 4)$$

$$p_j(t) = \begin{cases} 0 & \text{if } t = 0 \\ i_j(t-1) & \text{otherwise} \end{cases} \quad (j = 1, 2)$$

$$p_j(t) = \begin{cases} 0 & \text{if } t = 0 \\ p_j(t-1) + \sum_{k=1}^{4} w_{jk} v_k(t-1) & \text{otherwise} \end{cases} \quad (j = 3, 4)$$

[Figure: units $u_1$ and $u_2$ receive the external inputs $i_1$ and $i_2$ and feed units $u_3$ and $u_4$, which emit the values $v_3$ and $v_4$.] $w_{32} = w_{41} = w_{33} = w_{44} = 0$; all other weights are given in the figure.

What happens if the network is synchronously updated?
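The weights in the figure are not recoverable from this transcription, so the following simulation sketch assumes $w_{31} = w_{42} = 1$ and mutual inhibition $w_{34} = w_{43} = -1$ purely to make the synchronous update scheme concrete; with the actual weights the trajectory will differ.

```python
def simulate(steps=6):
    # ASSUMED weights; the original figure is not available.
    w = {(3, 1): 1.0, (4, 2): 1.0, (3, 4): -1.0, (4, 3): -1.0}
    p = {j: 0.0 for j in (1, 2, 3, 4)}   # p_j(0) = 0
    v = {j: round(p[j]) for j in p}      # v_j(t) = round(p_j(t))
    for t in range(1, steps + 1):
        i = {1: 6 if t - 1 == 0 else 2,  # external inputs at time t-1
             2: 5 if t - 1 == 0 else 2}
        new_p = {1: i[1], 2: i[2]}       # p_j(t) = i_j(t-1) for j = 1, 2
        for j in (3, 4):                 # accumulating units j = 3, 4
            new_p[j] = p[j] + sum(w.get((j, k), 0.0) * v[k]
                                  for k in (1, 2, 3, 4))
        p = new_p                        # all units switch together
        v = {j: round(p[j]) for j in p}
        print(t, [v[j] for j in (1, 2, 3, 4)])

simulate()
```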

14 Winner-Take-All Networks

A winner-take-all network is a synchronously updated connectionist network of $n$ units such that, after each unit receives an initial input at $t = 0$ (all other inputs are 0), eventually only the unit with the highest initial input outputs a value greater than 0, whereas the values of all other units are 0.

Exercise: Construct a winner-take-all network of 3 units (ignoring input units). One possible construction is sketched below.
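One possible solution is the classical MAXNET-style construction (a sketch under the assumption that linear-threshold outputs $v = \max(0, p)$ are allowed; this is not necessarily the construction intended in the lecture): each of the 3 units keeps its own value and inhibits every other unit with a weight $-\varepsilon$, where $0 < \varepsilon < 1/3$.

```python
def winner_take_all(initial, eps=0.2):
    """MAXNET-style WTA: repeat v_k <- max(0, v_k - eps * sum of the others)
    synchronously until at most one unit outputs a value greater than 0."""
    v = [max(0.0, x) for x in initial]
    while sum(1 for x in v if x > 0) > 1:
        # the comprehension reads only the time-t values, so the update is synchronous
        v = [max(0.0, v[k] - eps * (sum(v) - v[k])) for k in range(len(v))]
    return v

print(winner_take_all([0.9, 0.5, 0.8]))  # only the first unit remains positive
```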
