Principles of Neural Information Theory

Principles of Neural Information Theory
Computational Neuroscience and Metabolic Efficiency

James V Stone

Title: Principles of Neural Information Theory
Author: James V Stone
© 2017 Sebtel Press
All rights reserved. No part of this book may be reproduced or transmitted in any form without written permission from the author. The author asserts his moral right to be identified as the author of this work in accordance with the Copyright, Designs and Patents Act.
First Edition. First printing. Typeset in LaTeX2e. File: main NeuralInfoTheory v39.tex.
ISBN
Cover based on a photograph of a Purkinje cell from mouse cerebellum injected with Lucifer Yellow. Courtesy of the National Center for Microscopy and Imaging Research, University of California.

For Teleri

Contents

1. All That We See: Introduction; In the Light of Evolution; In Search of General Principles; Information Theory and Biology; An Overview of Chapters
2. Information Theory: Introduction; Finding a Route, Bit by Bit; Information and Entropy; Entropy and Uncertainty; Maximum Entropy Distributions; Channel Capacity; Mutual Information; The Gaussian Channel; Efficiency; Summary
3. Measuring Neural Information: Introduction; The Neuron; Why Spikes?; Neural Information; Gaussian Firing Rates; Information About What?; Does Timing Precision Matter?; Rate Codes and Timing Codes; Summary
4. Pricing Neural Information: Introduction; The Efficiency-Rate Trade-Off; Paying With Spikes; Paying With Hardware; Paying With Power; Optimal Axon Diameters; Optimal Distribution of Axon Diameters; Axon Diameter and Spike Speed; Optimal Mean Firing Rate; Optimal Distribution of Firing Rates; Optimal Synaptic Conductance; Summary

5. Encoding Colour: Introduction; The Eye; How Aftereffects Occur; The Problem With Colour; A Neural Encoding Strategy; Encoding Colour; Why Aftereffects Occur; Measuring Mutual Information; Maximising Mutual Information; Principal Component Analysis; PCA and Mutual Information; Evidence for Efficiency; Summary
6. Encoding Time: Introduction; Linear Models; Neurons and Wine Glasses; The LNP Model; Estimating LNP Parameters; The Predictive Coding Model; Estimating Predictive Coding Parameters; Predictive Coding and Information; Evidence for Predictive Coding; Summary
7. Encoding Space: Introduction; Spatial Frequency; Do Ganglion Cells Decorrelate Images?; Are Receptive Field Structures Optimal?; Predictive Coding of Images; Evidence For Predictive Coding; Is Receptive Field Spacing Optimal?; Summary
8. Encoding Visual Contrast: Introduction; The Compound Eye; Not Wasting Capacity; Measuring the Eye's Response; Maximum Entropy Encoding; Efficiency of Maximum Entropy Encoding; Summary

9. Reading the Neural Code: Introduction; Bayes' Rule; Bayesian Decoding; Encoding vs Decoding; Nonlinear Encoding, Linear Decoding; Linear Decodability; How Bayes Adds Bits; Summary
10. The Neural Rubicon: Introduction; The Darwinian Cost of Metabolic Efficiency; Crossing the Neural Rubicon
Further Reading 179
Appendices: A. Glossary 181; B. Mathematical Symbols 185; C. Correlation and Independence 189; D. A Vector-Matrix Tutorial 191; E. Neural Information Methods 195; F. The Marginal Value Theorem 201; G. Key Equations 203
References 205
Index 211

Preface

Some scientists consider the brain to be a collection of heuristics or hacks, which have accumulated over evolutionary history. Others think that the brain relies on a small number of fundamental principles, which underpin the diverse systems within the brain. This book provides a rigorous account of how Shannon's mathematical theory of information can be used to test one such principle: metabolic efficiency.

From Words to Mathematics. The methods used to explore metabolic efficiency lie in the realm of mathematical modelling. Mathematical models demand a precision unattainable with purely verbal accounts of brain function, and with this precision comes an equally precise quantitative predictive power. In contrast, the predictions of purely verbal models can be vague, and this vagueness also makes them virtually indestructible, because predictive failures can often be explained away. No such luxury exists for mathematical models. In this respect, mathematical models are easy to test, and weak models are easy to disprove. So, in the Darwinian world of mathematical modelling, survivors tend to be few, but those few tend to be supremely fit.

This is not to suggest that purely verbal models are always inferior. Such models are a necessary first step in understanding. But continually refining a verbal model into ever more rarefied forms is not scientific progress. Eventually, a purely verbal model should evolve to the point where it can be reformulated as a mathematical model, with predictions that can be tested against measurable physical quantities. Happily, most branches of neuroscience reached this state of scientific maturity some time ago.

Signposts. The writing style adopted here describes the raw science of neural information theory, unfettered by the conventions of standard textbooks. Accordingly, key concepts are introduced informally, before being described mathematically.

However, such an informal style can easily be misinterpreted as poor scholarship, because informal writing is often sloppy writing. But the way we write is usually only loosely related to the way we speak when giving a lecture, for example. A good lecture includes many asides and hints about what is, and is not, important. In contrast, scientific writing is usually formal, bereft of signposts about where the main theoretical structures are to be found, and how to avoid the many pitfalls which can mislead the unwary. So, unlike most textbooks, and like the best lectures, this book is intended to be both informal and rigorous, with prominent signposts as to where the main insights are to be found, and many warnings about where they are not.

Originality. Evidence is presented in the form of scientific papers and books, which are cited using a conventional format (e.g. Smith and Jones (1959)). When the sheer number of cited sources is large, numbered superscripts are used in order to prevent the text from becoming unreadable. Occasionally, facts are presented without evidence, either because they are reasonably self-evident, or because they can be found in standard texts. Consequently, the reader may wonder whether the ideas being presented are derived from other scientists, or whether they are unique to this book. In such cases, be reassured that almost none of the ideas in this book belong to the author. Indeed, like most books, this book represents a synthesis of ideas from many sources, but the general approach is inspired principally by these texts: Vision (1981) 79 by Marr; Spikes (1997) 97 by Rieke, Warland, de Ruyter van Steveninck and Bialek; Biophysics (2012) 19 by Bialek; and Principles of Neural Design (2015) 112 by Sterling and Laughlin.

In particular, Sterling and Laughlin pointed out that the amount of physiological data being published each year contributes to a growing Data Mountain, which far outstrips the ability of current theories to make sense of those data. Accordingly, whilst this account is not intended to be definitive, it is intended to provide another piton to those established by Sterling and Laughlin on Data Mountain.

Who Should Read This Book? The material presented in this book is intended to be accessible to readers with a basic scientific education at undergraduate level. In essence, this book is intended

for those who wish to understand how the fundamental ingredients of inert matter, energy and information have been forged by evolution to produce a particularly efficient computational machine: the brain. Understanding how the elements of this triumvirate interact demands knowledge from a wide variety of academic disciplines, but principally from biology and mathematics. Accordingly, reading this book will require some patience from mathematicians, to navigate the conceptual shift involved in interpreting signal processing mathematics in terms of neural computation, and it will require sustained effort from biologists, to grasp the mathematical material. Even though some of the mathematics is fairly sophisticated, key concepts are introduced using geometric interpretations and diagrams wherever possible.

PowerPoint Slides of Figures. Most of the figures used in this book can be downloaded from

Corrections. Please send corrections to j.v.stone@sheffield.ac.uk. A list of corrections can be found at

Acknowledgments. Thanks to the developers of the LaTeX2e software, used to typeset this book. Shashank Vatedka deserves a special mention for checking the mathematics in a final draft of this book. Thanks to SOMEONE for meticulous copy-editing and proofreading. (NOT DONE YET). Thanks to John de Pledge, Royston Sellman, and Steve Snow for many discussions on the role of information in biology; to Patrick Keating for advice on the optics of photoreceptors; to Frederic Theunissen for advice on measuring neural information; and to Mike Land for help with disentangling neural superposition. For reading either individual chapters or all chapters, I am very grateful to David Atwell, Horace Barlow, Ilona Box, Julian Budd, Matthew Crosby, Nikki Hunkin, Simon Laughlin, Raymond Lister, Danielle Matthews, Pasha Parpia, Jenny Read, Jung-Tsung Shen, Tom Stafford, Eleni Vasilaki, Paul Warren and Stuart Wilson.

James V Stone, Sheffield, England, 2017.

To understand life, one has to understand not just the flow of energy, but also the flow of information.
W Bialek, 2012.

Chapter 1
All That We See

When we see, we are not interpreting the pattern of light intensity that falls on our retina; we are interpreting the pattern of spikes that the million cells of our optic nerve send to the brain.
Rieke, Warland, de Ruyter van Steveninck, and Bialek, 1997.

1.1. Introduction

All that we see begins with an image formed on the eye's retina (Figure 1.1). Initially, this image is recorded by 126 million photoreceptors within the retina. The outputs of these photoreceptors are then repackaged, or encoded, via a series of intermediate connections, into a sequence of digital pulses, or spikes, which travel through the one million neurons of the optic nerve that connects the eye to the brain. The fact that we see so well suggests that the retina must be extraordinarily accurate when it encodes the image into spikes, and the brain must be equally accurate when it decodes those spikes into all that we see (Figure 1.2). But the eye and brain are not only good at translating the world into spikes, and spikes into perception; they are also efficient at transmitting information from the eye to the brain. Precisely how efficient is the subject of this book.

1.2. In the Light of Evolution

In 1973, the evolutionary biologist Dobzhansky famously wrote: "Nothing in biology makes sense except in the light of evolution." But evolution has to operate within limits set by the laws of physics and (as we shall see) the laws of information. Just as a bird cannot fly without obeying the laws of physics, so a brain cannot function without obeying the laws of information. And, just as the shape of a bird's wing is determined by the laws of physics, so the structure of a neuron is determined by the laws of information.

Neurons communicate information, and that is pretty much all that they do. But neurons are energetically expensive to make, maintain, and run 71. Half of a child's total energy budget, and a fifth of an adult's budget, is required just to keep the brain ticking over 108 (Figure 1.3). For children and adults, half the total energy budget of the brain is used for neuronal information processing, and the rest is used for basic maintenance 10. The high cost of using neurons probably accounts for the fact that only 2-4% of them are active at any one time 73. Given that neurons and spikes are so expensive, we should be unsurprised to find that when the visual data from the eye are encoded as a series of spikes, each neuron and each spike conveys information efficiently.

In the context of the Darwin-Wallace theory of evolution, it seems self-evident that neurons should be efficient. But, in order to formalise the notion of efficiency, we need a rigorous definition of Darwinian fitness. Even though fitness can be measured in terms of the number of offspring an animal manages to raise to sexual maturity, the connection

Figure 1.1. Cross-section of the eye, showing the lens, retina, and optic nerve. Modified from Wikimedia Commons.

between neural computation and fitness is difficult to define. In the absence of such a definition, we consider quantities which can act as a plausible proxy for fitness. One such proxy, with a long track-record in neuroscience, is information, which is measured in units of bits. The amount of information an animal can gather from its environment is related to fitness because information, in the form of sight, sound and scent, ultimately provides food, mates and shelter. However, information comes at a price, paid in neuronal infrastructure and energy. So animals want information, but they want information at a price that will increase their fitness. This means that animals usually want cheap information.

It is often said that there is no such thing as a free lunch, which is as true in Nature as it is in New York. If an animal demands that a neuron delivers information at a high rate, then the laws of information dictate that the price per bit will be high; a price that is paid in Joules. However, sometimes information is worth having even if it is expensive. For example, visually responsive neurons which alert an animal to the presence of imminent danger (e.g. by identifying a predator) require large amounts of information to be transmitted rapidly, even if the energy cost of transmitting that information through neurons in the visual system is high. Such neurons take full advantage of their potential for transmitting information, so they maximise their coding efficiency. Conversely, if an animal demands that information

Figure 1.2. Encoding and decoding (schematic). A rapidly changing luminance (bold curve in b) is encoded as a spike train (a), which is decoded to estimate the luminance (thin curve in b). See Chapters 6 and 9.

is delivered at the lowest price per bit, with maximal metabolic efficiency 74;112, then those same laws dictate that the information rate must be low.

The idea of coding efficiency has a longer history than metabolic efficiency, and has been enshrined as the efficient coding hypothesis, which posits that the way neurons encode sensory data is designed by Nature to transmit as much information as possible. It has been championed over many years by Horace Barlow (1959) 16, amongst others (e.g. Attneave (1954) 9, Atick (1992) 5), and has had a substantial influence on computational neuroscience. However, accumulating evidence, summarised in this text, suggests that metabolic efficiency, rather than coding efficiency, may be the dominant influence on Nature's design of neural systems. Both coding efficiency and metabolic efficiency are defined more formally in Section 2.9.

We should note that there are a number of different computational models which collectively fall under the umbrella term efficient coding. To a first approximation, the results of applying the methods associated with these models tend to be similar 97, even though the methods are different. These methods include sparse coding 47, principal component analysis, independent component analysis 18;113, information maximisation (infomax) 75, redundancy reduction 7, and predictive coding 94;110.

Figure 1.3. (a) The human brain weighs about 1300 g, contains about 10^10 neurons, and consumes about 12 Watts of power. The outer surface seen here is the neocortex. (b) Each neuron (plus its support structures) therefore accounts for an average of 1200 pJ/s (1 pJ/s = 10^-12 J/s). From Wikimedia Commons.
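The per-neuron power figure in the caption of Figure 1.3 follows from simple arithmetic on the two numbers given in the text (12 W total, about 10^10 neurons). A quick sketch, purely as a sanity check:

```python
# Sanity check on the per-neuron power budget quoted in Figure 1.3,
# using the two figures given in the text.
brain_power_watts = 12.0   # total power consumption of the brain (from text)
num_neurons = 1e10         # approximate number of neurons (from text)

watts_per_neuron = brain_power_watts / num_neurons  # 1 W = 1 J/s
picojoules_per_sec = watts_per_neuron * 1e12        # 1 W = 1e12 pJ/s

print(f"{picojoules_per_sec:.0f} pJ/s per neuron")  # 1200 pJ/s per neuron
```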

1.3. In Search of General Principles

The test of a theory is not just whether or not it accounts for a body of data, but also how complex the theory is in relation to the complexity of the data being explained. Clearly, if a theory is, in some sense, more convoluted than the phenomena it explains, then it is not much of a theory. This is why we favour theories that explain a vast range of phenomena with a minimum of words or equations. A prime example of a parsimonious theory is Newton's theory of gravitation, which explains (amongst other things) how a ball falls to Earth, how atmospheric pressure varies with height above the Earth, and how the Earth orbits the Sun. In essence, we favour theories which rely on a general principle to explain a diverse range of physical phenomena.

However, even a theory based on general principles is of little use if it is too vague to be tested rigorously. Accordingly, if we want to understand how the brain works, then we need more than a theory expressed in mere words. For example, if the theory of gravitation were stated only in words, then we could say that each planet has an approximately circular orbit, but we would have to use many words to prove precisely why each orbit must be elliptical, and to state exactly how elliptical each orbit is. In contrast, a few equations express these facts exactly, and without ambiguity. Thus, whereas words are required to provide theoretical context, mathematics imposes a degree of precision which is extremely difficult, if not impossible, to achieve with words alone. To quote Galileo Galilei (1623): "The universe is written in this grand book, which stands continually open to our gaze, but it cannot be understood unless one first learns to comprehend the language in which it is written. It is written in the language of mathematics, without which it is humanly impossible to understand a single word of it."

In the spirit of Galileo's recommendation, we begin with a quantitative definition of information, in Chapter 2.
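To anticipate that definition: Shannon measured the information of an outcome with probability p as log2(1/p) bits, and the entropy of a source as its average information per outcome. The coin-flip sketch below is an illustration of mine, not an example from the book:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin yields 1 bit per flip; a predictable (biased) coin yields less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.47 bits
```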

1.4. Information Theory and Biology

Claude Shannon's theory of communication 103 (1948) heralded a transformation in our understanding of information. Before 1948, information was regarded as a kind of miasmic fluid, but afterwards it became apparent that information is a well-defined and, above all, measurable quantity. Since that time, it has become increasingly apparent that information, and the energy cost of information, impose fundamental limits on the form and function of biological mechanisms.

Shannon's theory provides a mathematical definition of information, and describes precisely how much information can be communicated between different elements of a system. This may not sound like much, but Shannon's theory underpins our understanding of why there are definite limits to the rate at which information can be processed within any system, whether man-made or biological. Information theory does not place any conditions on what type of mechanism processes information in order to achieve a given objective; in other words, on exactly how that objective is to be achieved. However, unless there are unlimited amounts of energy available, relatively little information will reach the brain without some form of encoding. In other words, information theory does not specify how any biological function, such as vision, is implemented, but it does set fundamental limits on what is achievable by any physical mechanism within any visual system.

The distinction between a function and the mechanism which implements that function is a cornerstone of David Marr's (1982) 79 computational theory of vision. Marr stressed the need to consider physiological findings in the context of computational models, and his approach is epitomised in a single quote: "Trying to understand perception by studying only neurons is like trying to understand bird flight by studying only feathers: It just cannot be done."

Even though Marr did not address the role of information theory directly, his analytic approach has served as a source of inspiration, not only for this book, but also for much of the progress made within computational neuroscience.

1.5. An Overview of Chapters

This section contains technical terms which are explained fully in the relevant chapters, and in the Glossary (Appendix A, p181). To fully appreciate the importance of information theory for neural computation, some familiarity with the basic elements of information theory is required; these elements are presented in Chapter 2 (which can be skipped on a first reading of the book).

In Chapter 3, we use information theory to estimate the amount of information in the output of a spiking neuron, and also to estimate how much of this information (i.e. mutual information) is related to the neuron's input.

In Chapter 4, we discover that one of the consequences of information theory (specifically, Shannon's noisy coding theorem) is that the cost of information rises inexorably and disproportionately with information rate. We consider empirical results which suggest that this steep rise accounts for physiological values of axon diameter, the distribution of axon diameters, mean firing rate, and synaptic conductance; values which appear to be tuned to minimise the cost of information.

In Chapter 5, we consider how the correlations between the outputs of photoreceptors sensitive to similar colours threaten to reduce information rates, and how this can be ameliorated by synaptic pre-processing in the retina. This pre-processing makes efficient use of the available neuronal infrastructure to maximise information rates, which explains not only how, but also why, there is a red-green aftereffect but no red-blue aftereffect. A more formal account involves using principal component analysis to estimate the synaptic connections which maximise neuronal information throughput.

In Chapter 6, the lessons learned so far are applied to the problem of encoding time-varying visual inputs. We explore how a standard (LNP) neuron model can be used as a model of physiological neurons. We then introduce a model based on predictive coding, which yields similar results to the LNP model, and we consider how predictive coding represents a biologically plausible candidate for maximising information rates and minimising the cost of information.

In Chapter 7, we consider how information theory predicts the receptive field structures of retinal ganglion cells across a range of

luminance conditions. Evidence is presented that these receptive field structures are also obtained using predictive coding, and the information-theoretic analysis applied to time-varying visual inputs (in Chapter 6) is extended to the spatial domain (i.e. retinal images).

Once colour, temporal and spatial structure have been encoded by a neuron, the resultant signals must pass through the neuron's nonlinear input/output (transfer) function. Accordingly, Chapter 8 is based on a classic paper by Simon Laughlin (1981) 68, which predicts the precise form that this transfer function should adopt in order to maximise information throughput; a form which matches the transfer function found in visual neurons.

Having considered how neurons encode sensory inputs, the problem of how the brain decodes neuronal outputs is addressed in Chapter 9, where the importance of prior knowledge, or experience, is explored in the context of Bayes' theorem. We also consider how often a neuron should produce a spike so that each spike conveys as much information as possible, and we discover that the answer involves a vital property of efficient communication (i.e. linear decodability).

A fundamental tenet of the computational approach adopted in this text is that, within each chapter, we explore particular neuronal mechanisms, how they work, and (most importantly) why they work in the way they do. As a result, each chapter documents evidence that the design of biological mechanisms is determined largely by the need to process information efficiently.
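Laughlin's maximum-entropy result (the subject of Chapter 8) can be illustrated numerically: a transfer function that matches the cumulative distribution of the inputs spreads the responses uniformly across the output range, which maximises output entropy. A toy sketch, using simulated Gaussian "contrasts" rather than Laughlin's fly data:

```python
import bisect
import random

random.seed(0)
contrasts = [random.gauss(0.0, 1.0) for _ in range(10000)]
sorted_c = sorted(contrasts)

def transfer(x):
    """Empirical CDF: the fraction of observed contrasts <= x.
    Using the input CDF as the transfer function is the
    maximum-entropy encoding for this input distribution."""
    return bisect.bisect_right(sorted_c, x) / len(sorted_c)

responses = [transfer(x) for x in contrasts]

# The responses should fill each tenth of the output range about equally.
bins = [0] * 10
for r in responses:
    bins[min(int(r * 10), 9)] += 1
print(bins)  # each bin holds roughly 1000 of the 10000 responses
```

Any other strictly increasing transfer function would crowd responses into some parts of the output range and leave others underused, wasting capacity.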

Dr James Stone is an Honorary Reader in Vision and Computational Neuroscience at the University of Sheffield, England. He is co-author, with John P. Frisby, of Seeing: The Computational Approach to Biological Vision (second edition).


More information

Elementary Linear Algebra, Second Edition, by Spence, Insel, and Friedberg. ISBN Pearson Education, Inc., Upper Saddle River, NJ.

Elementary Linear Algebra, Second Edition, by Spence, Insel, and Friedberg. ISBN Pearson Education, Inc., Upper Saddle River, NJ. 2008 Pearson Education, Inc., Upper Saddle River, NJ. All rights reserved. APPENDIX: Mathematical Proof There are many mathematical statements whose truth is not obvious. For example, the French mathematician

More information

Procedure for Setting Goals for an Introductory Physics Class

Procedure for Setting Goals for an Introductory Physics Class Procedure for Setting Goals for an Introductory Physics Class Pat Heller, Ken Heller, Vince Kuo University of Minnesota Important Contributions from Tom Foster, Francis Lawrenz Details at http://groups.physics.umn.edu/physed

More information

Communication Engineering Prof. Surendra Prasad Department of Electrical Engineering Indian Institute of Technology, Delhi

Communication Engineering Prof. Surendra Prasad Department of Electrical Engineering Indian Institute of Technology, Delhi Communication Engineering Prof. Surendra Prasad Department of Electrical Engineering Indian Institute of Technology, Delhi Lecture - 41 Pulse Code Modulation (PCM) So, if you remember we have been talking

More information

Reinterpreting Newton s Law of Gravitation. Copyright 2012 Joseph A. Rybczyk

Reinterpreting Newton s Law of Gravitation. Copyright 2012 Joseph A. Rybczyk Reinterpreting Newton s Law of Gravitation Copyright 2012 Joseph A. Rybczyk Abstract A new approach in interpreting Newton s law of gravitation leads to an improved understanding of the manner in which

More information

Population Coding in Retinal Ganglion Cells

Population Coding in Retinal Ganglion Cells Population Coding in Retinal Ganglion Cells Reza Abbasi Asl Electrical Engineering and Computer Sciences University of California at Berkeley Technical Report No. UCB/EECS-218-23 http://www2.eecs.berkeley.edu/pubs/techrpts/218/eecs-218-23.html

More information

THE L.I.F.E. PLAN CREATION DAY 2 BLOCK 1. THEME 2 - CREATION - PART 1 LESSON 3 (7 of 216)

THE L.I.F.E. PLAN CREATION DAY 2 BLOCK 1. THEME 2 - CREATION - PART 1 LESSON 3 (7 of 216) THE L.I.F.E. PLAN CREATION DAY 2 BLOCK 1 THEME 2 - CREATION - PART 1 LESSON 3 (7 of 216) BLOCK 1 THEME 2: CREATION - PART 1 LESSON 3 (7 OF 216): CREATION DAY 2 LESSON AIM: Show that which God created on

More information

Fracture Mechanics: Fundamentals And Applications, Third Edition Free Pdf Books

Fracture Mechanics: Fundamentals And Applications, Third Edition Free Pdf Books Fracture Mechanics: Fundamentals And Applications, Third Edition Free Pdf Books With its combination of practicality, readability, and rigor that is characteristic of any truly authoritative reference

More information

Lectures on Medical Biophysics Department of Biophysics, Medical Faculty, Masaryk University in Brno. Biocybernetics

Lectures on Medical Biophysics Department of Biophysics, Medical Faculty, Masaryk University in Brno. Biocybernetics Lectures on Medical Biophysics Department of Biophysics, Medical Faculty, Masaryk University in Brno Norbert Wiener 26.11.1894-18.03.1964 Biocybernetics Lecture outline Cybernetics Cybernetic systems Feedback

More information

AS/NZS ISO :2015

AS/NZS ISO :2015 Australian/New Zealand Standard Geographic information Reference model Part 1: Fundamentals Superseding AS/NZS ISO 19101:2003 AS/NZS ISO 19101.1:2015 (ISO 19101-1:2014, IDT) AS/NZS ISO 19101.1:2015 This

More information

Chapter 1. Preliminaries

Chapter 1. Preliminaries Chapter 1 Preliminaries 1.1 The Vector Concept Revisited The concept of a vector has been one of the most fruitful ideas in all of mathematics, and it is not surprising that we receive repeated exposure

More information

An introduction to basic information theory. Hampus Wessman

An introduction to basic information theory. Hampus Wessman An introduction to basic information theory Hampus Wessman Abstract We give a short and simple introduction to basic information theory, by stripping away all the non-essentials. Theoretical bounds on

More information

What is the neural code? Sekuler lab, Brandeis

What is the neural code? Sekuler lab, Brandeis What is the neural code? Sekuler lab, Brandeis What is the neural code? What is the neural code? Alan Litke, UCSD What is the neural code? What is the neural code? What is the neural code? Encoding: how

More information

Course: Zoology Course Number: Title: Zoology, 6 th Edition Authors: Miller, Harley Publisher: Glencoe/McGraw-Hill Copyright: 2005

Course: Zoology Course Number: Title: Zoology, 6 th Edition Authors: Miller, Harley Publisher: Glencoe/McGraw-Hill Copyright: 2005 Course: Zoology Course Number: 2000410 Title: Zoology, 6 th Edition Authors: Miller, Harley Publisher: Glencoe/McGraw-Hill Copyright: 2005 Online Resources used in Correlations These resources are made

More information

arxiv: v2 [cs.it] 20 Feb 2018

arxiv: v2 [cs.it] 20 Feb 2018 Information Theory: A Tutorial Introduction James V Stone, Psychology Department, University of Sheffield, England. j.v.stone@sheffield.ac.uk File: main InformationTheory JVStone v3.tex arxiv:82.5968v2

More information

Natural Image Statistics and Neural Representations

Natural Image Statistics and Neural Representations Natural Image Statistics and Neural Representations Michael Lewicki Center for the Neural Basis of Cognition & Department of Computer Science Carnegie Mellon University? 1 Outline 1. Information theory

More information

Chapter 9: The Perceptron

Chapter 9: The Perceptron Chapter 9: The Perceptron 9.1 INTRODUCTION At this point in the book, we have completed all of the exercises that we are going to do with the James program. These exercises have shown that distributed

More information

Membrane equation. VCl. dv dt + V = V Na G Na + V K G K + V Cl G Cl. G total. C m. G total = G Na + G K + G Cl

Membrane equation. VCl. dv dt + V = V Na G Na + V K G K + V Cl G Cl. G total. C m. G total = G Na + G K + G Cl Spiking neurons Membrane equation V GNa GK GCl Cm VNa VK VCl dv dt + V = V Na G Na + V K G K + V Cl G Cl G total G total = G Na + G K + G Cl = C m G total Membrane with synaptic inputs V Gleak GNa GK

More information

Section III. Biochemical and Physiological Adaptations

Section III. Biochemical and Physiological Adaptations Section III Biochemical and Physiological Adaptations Introduction S.N. ARCHER and M.B.A. DJAMGOZ For a sensory system to function optimally, it must be adapted to receiving and responding to specific

More information

PHY3H. (JUn11PHY3H01) General Certificate of Secondary Education Higher Tier June Unit Physics P3. Written Paper TOTAL. Time allowed 45 minutes

PHY3H. (JUn11PHY3H01) General Certificate of Secondary Education Higher Tier June Unit Physics P3. Written Paper TOTAL. Time allowed 45 minutes Centre Number Surname Candidate Number For Examiner s Use Other Names Candidate Signature Examiner s Initials Physics General Certificate of Secondary Education Higher Tier June 2011 PHY3H Question 1 2

More information

theory of quantum computation communication and cryptography 7th conference tqc 2012 tokyo japan may revised selected papers

theory of quantum computation communication and cryptography 7th conference tqc 2012 tokyo japan may revised selected papers DOWNLOAD OR READ : THEORY OF QUANTUM COMPUTATION COMMUNICATION AND CRYPTOGRAPHY 7TH CONFERENCE TQC 2012 TOKYO JAPAN MAY 17 19 2012 REVISED SELECTED PAPERS PDF EBOOK EPUB MOBI Page 1 Page 2 tokyo japan

More information

Chapter 1 Introduction

Chapter 1 Introduction Chapter 1 Introduction 1.1 Introduction to Chapter This chapter starts by describing the problems addressed by the project. The aims and objectives of the research are outlined and novel ideas discovered

More information

Thermal Physics. Energy and Entropy

Thermal Physics. Energy and Entropy Thermal Physics Energy and Entropy Written by distinguished physics educator, this fresh introduction to thermodynamics, statistical mechanics and the study of matter is ideal for undergraduate courses.

More information

Reading Passage. Darwin's Theory of Evolution - The Premise

Reading Passage. Darwin's Theory of Evolution - The Premise Darwin's Theory of Evolution - The Premise Reading Passage Darwin's Theory of Evolution is the widely held notion that all life is related and has descended from a common ancestor: the birds and the bananas,

More information

Fisher Information Quantifies Task-Specific Performance in the Blowfly Photoreceptor

Fisher Information Quantifies Task-Specific Performance in the Blowfly Photoreceptor Fisher Information Quantifies Task-Specific Performance in the Blowfly Photoreceptor Peng Xu and Pamela Abshire Department of Electrical and Computer Engineering and the Institute for Systems Research

More information

How the Relationship Between Information Theory and Thermodynamics Can Contribute to Explaining Brain and Cognitive Activity: An Integrative Approach

How the Relationship Between Information Theory and Thermodynamics Can Contribute to Explaining Brain and Cognitive Activity: An Integrative Approach How the Relationship Between Information Theory and Thermodynamics Can Contribute to Explaining Brain and Cognitive Activity: An Integrative Approach Guillem Collell Research Unit in Cognitive Neuroscience,

More information

What s the right answer? Doubt, Uncertainty, and Provisional Truth in Science

What s the right answer? Doubt, Uncertainty, and Provisional Truth in Science What s the right answer? Doubt, Uncertainty, and Provisional Truth in Science Rick Dower and Peter Hyde The Roxbury Latin School IBSC Conference Workshop The City of London School July 2011 What s the

More information

Lesson 1 Syllabus Reference

Lesson 1 Syllabus Reference Lesson 1 Syllabus Reference Outcomes A student Explains how biological understanding has advanced through scientific discoveries, technological developments and the needs of society. Content The theory

More information

Other Organisms (Part 3)

Other Organisms (Part 3) Name: Hour: Teacher: ROZEMA Biology Evolution Unit Addie Bacteria Other Organisms (Part 3) Let s Review What We Know So Far: Natural Selection is There are differences between the Junco birds that live

More information

Bayesian probability theory and generative models

Bayesian probability theory and generative models Bayesian probability theory and generative models Bruno A. Olshausen November 8, 2006 Abstract Bayesian probability theory provides a mathematical framework for peforming inference, or reasoning, using

More information

MIDDLE SCHOOL BIOLOGY LABORATORY 1ST SEMESTER NAME: DATE: Activity: for each text we will highlight the most important information.

MIDDLE SCHOOL BIOLOGY LABORATORY 1ST SEMESTER NAME: DATE: Activity: for each text we will highlight the most important information. NAME: DATE: TEACHER: Albert Hernandez. GRADE: 2 nd I. Read text carefully and answer the questions bellow. Activity: for each text we will highlight the most important information. The Goal of Science

More information

Learning Predictive Filters

Learning Predictive Filters Learning Predictive Filters Lane McIntosh Neurosciences Graduate Program, Stanford University Stanford, CA 94305 (Dated: December 13, 2013) We examine how a system intent on only keeping information maximally

More information

CE 321 Sample Laboratory Report Packet

CE 321 Sample Laboratory Report Packet CE 321 Sample Laboratory Report Packet This packet contains the following materials to help you prepare your lab reports in CE 321: An advice table with Dr. Wallace s hints regarding common strengths and

More information

The Perceptron. Volker Tresp Summer 2014

The Perceptron. Volker Tresp Summer 2014 The Perceptron Volker Tresp Summer 2014 1 Introduction One of the first serious learning machines Most important elements in learning tasks Collection and preprocessing of training data Definition of a

More information

Transformation of stimulus correlations by the retina

Transformation of stimulus correlations by the retina Transformation of stimulus correlations by the retina Kristina Simmons (University of Pennsylvania) and Jason Prentice, (now Princeton University) with Gasper Tkacik (IST Austria) Jan Homann (now Princeton

More information

Vector Algebra II: Scalar and Vector Products

Vector Algebra II: Scalar and Vector Products Chapter 2 Vector Algebra II: Scalar and Vector Products ------------------- 2 a b = 0, φ = 90 The vectors are perpendicular to each other 49 Check the result geometrically by completing the diagram a =(4,

More information

Chemistry by Computer. An Overview of the Applications of Computers in Chemistry

Chemistry by Computer. An Overview of the Applications of Computers in Chemistry Chemistry by Computer An Overview of the Applications of Computers in Chemistry Chemistry by Computer An Overview of the Applications of Computers in Chemistry Stephen Wilson Theoretical Chemistry Department

More information

Chapter 1 Chemistry, Matter, and Measurement Opening Essay

Chapter 1 Chemistry, Matter, and Measurement Opening Essay Chapter 1 Chemistry, Matter, and Measurement Opening Essay In April 2003, the US Pharmacopeia, a national organization that establishes quality standards for medications, reported a case in which a physician

More information

August 27, Review of Algebra & Logic. Charles Delman. The Language and Logic of Mathematics. The Real Number System. Relations and Functions

August 27, Review of Algebra & Logic. Charles Delman. The Language and Logic of Mathematics. The Real Number System. Relations and Functions and of August 27, 2015 and of 1 and of 2 3 4 You Must Make al Connections and of Understanding higher mathematics requires making logical connections between ideas. Please take heed now! You cannot learn

More information

Brains and Computation

Brains and Computation 15-883: Computational Models of Neural Systems Lecture 1.1: Brains and Computation David S. Touretzky Computer Science Department Carnegie Mellon University 1 Models of the Nervous System Hydraulic network

More information

STOCHASTIC PROCESSES FOR PHYSICISTS. Understanding Noisy Systems

STOCHASTIC PROCESSES FOR PHYSICISTS. Understanding Noisy Systems STOCHASTIC PROCESSES FOR PHYSICISTS Understanding Noisy Systems Stochastic processes are an essential part of numerous branches of physics, as well as biology, chemistry, and finance. This textbook provides

More information

Kantian metaphysics to mind-brain. The approach follows Bacon s investigative method

Kantian metaphysics to mind-brain. The approach follows Bacon s investigative method 82 Basic Tools and Techniques As discussed, the project is based on mental physics which in turn is the application of Kantian metaphysics to mind-brain. The approach follows Bacon s investigative method

More information

Lecture 1: Introduction, Entropy and ML estimation

Lecture 1: Introduction, Entropy and ML estimation 0-704: Information Processing and Learning Spring 202 Lecture : Introduction, Entropy and ML estimation Lecturer: Aarti Singh Scribes: Min Xu Disclaimer: These notes have not been subjected to the usual

More information

Standards A complete list of the standards covered by this lesson is included in the Appendix at the end of the lesson.

Standards A complete list of the standards covered by this lesson is included in the Appendix at the end of the lesson. Lesson 8: The History of Life on Earth Time: approximately 45-60 minutes, depending on length of discussion. Can be broken into 2 shorter lessons Materials: Double timeline (see below) Meter stick (to

More information

1/12/2017. Computational neuroscience. Neurotechnology.

1/12/2017. Computational neuroscience. Neurotechnology. Computational neuroscience Neurotechnology https://devblogs.nvidia.com/parallelforall/deep-learning-nutshell-core-concepts/ 1 Neurotechnology http://www.lce.hut.fi/research/cogntech/neurophysiology Recording

More information

Earth Life System. An Introduction to the

Earth Life System. An Introduction to the An Introduction to the Earth Life System This undergraduate textbook brings together Earth and biological sciences to explore the co-evolution of the Earth and life over geological time. It examines the

More information

Marr's Theory of the Hippocampus: Part I

Marr's Theory of the Hippocampus: Part I Marr's Theory of the Hippocampus: Part I Computational Models of Neural Systems Lecture 3.3 David S. Touretzky October, 2015 David Marr: 1945-1980 10/05/15 Computational Models of Neural Systems 2 Marr

More information

Genetic Changes Lesson 2 HW

Genetic Changes Lesson 2 HW Guiding Question What theory serves as the basis of what we believe about how evolutionary changes occur? 7 th GRADE SCIENCE Genetic Changes Lesson 2 HW # Name: Date: Homeroom: Jean-Baptiste Lamarck (1744-1829)

More information

Signal, donnée, information dans les circuits de nos cerveaux

Signal, donnée, information dans les circuits de nos cerveaux NeuroSTIC Brest 5 octobre 2017 Signal, donnée, information dans les circuits de nos cerveaux Claude Berrou Signal, data, information: in the field of telecommunication, everything is clear It is much less

More information

The grand theory of astrology

The grand theory of astrology Astrology for Aquarius Sharing our Knowledge Hermetic astrology The grand theory of astrology The Brotherhood of Light The grand theory of astrology The Brotherhood of Light 1 Introduction Astrology was

More information

Statistical models for neural encoding

Statistical models for neural encoding Statistical models for neural encoding Part 1: discrete-time models Liam Paninski Gatsby Computational Neuroscience Unit University College London http://www.gatsby.ucl.ac.uk/ liam liam@gatsby.ucl.ac.uk

More information

Welcome to the Worldwide Integral Calculus textbook; the second textbook from the Worldwide Center of Mathematics.

Welcome to the Worldwide Integral Calculus textbook; the second textbook from the Worldwide Center of Mathematics. 0.1. PREFACE 0.1 Preface Welcome to the Worldwide Integral Calculus textbook; the second textbook from the Worldwide Center of Mathematics. Our goal with this textbook is, of course, to help you learn

More information

Independent Component Analysis

Independent Component Analysis Independent Component Analysis James V. Stone November 4, 24 Sheffield University, Sheffield, UK Keywords: independent component analysis, independence, blind source separation, projection pursuit, complexity

More information

CISC 3250 Systems Neuroscience

CISC 3250 Systems Neuroscience CISC 3250 Systems Neuroscience Systems Neuroscience How the nervous system performs computations How groups of neurons work together to achieve intelligence Professor Daniel Leeds dleeds@fordham.edu JMH

More information

Mathematical Tools for Neuroscience (NEU 314) Princeton University, Spring 2016 Jonathan Pillow. Homework 8: Logistic Regression & Information Theory

Mathematical Tools for Neuroscience (NEU 314) Princeton University, Spring 2016 Jonathan Pillow. Homework 8: Logistic Regression & Information Theory Mathematical Tools for Neuroscience (NEU 34) Princeton University, Spring 206 Jonathan Pillow Homework 8: Logistic Regression & Information Theory Due: Tuesday, April 26, 9:59am Optimization Toolbox One

More information

Chemistry 883 Computational Quantum Chemistry

Chemistry 883 Computational Quantum Chemistry Chemistry 883 Computational Quantum Chemistry Instructor Contact Information Professor Benjamin G. Levine levine@chemistry.msu.edu 215 Chemistry Building 517-353-1113 Office Hours Tuesday 9:00-11:00 am

More information

Sections 18.6 and 18.7 Artificial Neural Networks

Sections 18.6 and 18.7 Artificial Neural Networks Sections 18.6 and 18.7 Artificial Neural Networks CS4811 - Artificial Intelligence Nilufer Onder Department of Computer Science Michigan Technological University Outline The brain vs. artifical neural

More information

Might have Minkowski discovered the cause of gravity before Einstein? Vesselin Petkov Minkowski Institute Montreal, Canada

Might have Minkowski discovered the cause of gravity before Einstein? Vesselin Petkov Minkowski Institute Montreal, Canada Might have Minkowski discovered the cause of gravity before Einstein? Vesselin Petkov Minkowski Institute Montreal, Canada OUTLINE We will never know how physics would have developed had Hermann Minkowski

More information

Direct Proof and Counterexample I:Introduction

Direct Proof and Counterexample I:Introduction Direct Proof and Counterexample I:Introduction Copyright Cengage Learning. All rights reserved. Goal Importance of proof Building up logic thinking and reasoning reading/using definition interpreting :

More information

3/30/2012. Two Contrasting but Complementary Evolutionary Perspectives on Human Behavior:

3/30/2012. Two Contrasting but Complementary Evolutionary Perspectives on Human Behavior: Two Contrasting but Complementary Perspectives on Human Behavior: Psychology (EP) derived from a synthesis of biology and psychology Human Behavioral Ecology (HEB) derived from a synthesis of biology and

More information

Hubble Space Telescope

Hubble Space Telescope Before the first telescopes were invented at the beginning of the 17th century, people looked up at the stars with their naked eyes. The first refracting telescope that Galileo developed in 1609 was likely

More information

6.02 Fall 2012 Lecture #1

6.02 Fall 2012 Lecture #1 6.02 Fall 2012 Lecture #1 Digital vs. analog communication The birth of modern digital communication Information and entropy Codes, Huffman coding 6.02 Fall 2012 Lecture 1, Slide #1 6.02 Fall 2012 Lecture

More information

The Robustness of Stochastic Switching Networks

The Robustness of Stochastic Switching Networks The Robustness of Stochastic Switching Networks Po-Ling Loh Department of Mathematics California Institute of Technology Pasadena, CA 95 Email: loh@caltech.edu Hongchao Zhou Department of Electrical Engineering

More information

HS AP Physics 1 Science

HS AP Physics 1 Science Scope And Sequence Timeframe Unit Instructional Topics 5 Day(s) 20 Day(s) 5 Day(s) Kinematics Course AP Physics 1 is an introductory first-year, algebra-based, college level course for the student interested

More information

Direct Proof and Counterexample I:Introduction. Copyright Cengage Learning. All rights reserved.

Direct Proof and Counterexample I:Introduction. Copyright Cengage Learning. All rights reserved. Direct Proof and Counterexample I:Introduction Copyright Cengage Learning. All rights reserved. Goal Importance of proof Building up logic thinking and reasoning reading/using definition interpreting statement:

More information

The Perceptron. Volker Tresp Summer 2016

The Perceptron. Volker Tresp Summer 2016 The Perceptron Volker Tresp Summer 2016 1 Elements in Learning Tasks Collection, cleaning and preprocessing of training data Definition of a class of learning models. Often defined by the free model parameters

More information

Nonlinear reverse-correlation with synthesized naturalistic noise

Nonlinear reverse-correlation with synthesized naturalistic noise Cognitive Science Online, Vol1, pp1 7, 2003 http://cogsci-onlineucsdedu Nonlinear reverse-correlation with synthesized naturalistic noise Hsin-Hao Yu Department of Cognitive Science University of California

More information

Astronomy with a Budget Telescope

Astronomy with a Budget Telescope Astronomy with a Budget Telescope Springer-Verlag London Ltd. Patrick Moore and John Watson Astro omy w h a Budget elescope With 100 Figures, 98 in colour, Springer British Library Cataloguing in Publication

More information

Searching for simple models

Searching for simple models Searching for simple models 9th Annual Pinkel Endowed Lecture Institute for Research in Cognitive Science University of Pennsylvania Friday April 7 William Bialek Joseph Henry Laboratories of Physics,

More information

All-or-None Principle and Weakness of Hodgkin-Huxley Mathematical Model

All-or-None Principle and Weakness of Hodgkin-Huxley Mathematical Model All-or-None Principle and Weakness of Hodgkin-Huxley Mathematical Model S. A. Sadegh Zadeh, C. Kambhampati International Science Index, Mathematical and Computational Sciences waset.org/publication/10008281

More information

Matter and Motion. Written by Edward Shevick Illustrated by Marguerite Jones. Teaching & Learning Company. Teaching & Learning Company

Matter and Motion. Written by Edward Shevick Illustrated by Marguerite Jones. Teaching & Learning Company. Teaching & Learning Company Matter and Motion Written by Edward Shevick Illustrated by Marguerite Jones Teaching & Learning Company Teaching & Learning Company a Lorenz company P.O. Box 802, Dayton, OH 45401-0802 www.lorenzeducationalpress.com

More information

Efficient coding of natural images with a population of noisy Linear-Nonlinear neurons

Efficient coding of natural images with a population of noisy Linear-Nonlinear neurons To appear in: Neural Information Processing Systems (NIPS), http://nips.cc/ Granada, Spain. December 12-15, 211. Efficient coding of natural images with a population of noisy Linear-Nonlinear neurons Yan

More information

A Brief Introduction to Proofs

A Brief Introduction to Proofs A Brief Introduction to Proofs William J. Turner October, 010 1 Introduction Proofs are perhaps the very heart of mathematics. Unlike the other sciences, mathematics adds a final step to the familiar scientific

More information