Historical Perspectives

John von Neumann (1903-1957)
- Contributed to set theory, functional analysis, quantum mechanics, ergodic theory, economics, geometry, hydrodynamics, statistics, analysis, measure theory, ballistics, meteorology, ...
- Invented game theory (used in the Cold War)
- Re-axiomatized set theory
- Principal member of the Manhattan Project
- Helped design the hydrogen (fusion) bomb
- Pioneered modern computer science
- Originated the stored-program "von Neumann architecture" and the von Neumann bottleneck
- Helped design & build the EDVAC computer
- Created the field of cellular automata
- Investigated self-replication
- Invented merge sort

(Photos: Stanislaw Ulam, Richard Feynman)

"Most mathematicians prove what they can; von Neumann proves what he wants."

von Neumann's Legacy
- Re-axiomatized set theory to address Russell's paradox
- Independently proved Gödel's second incompleteness theorem: axiomatic systems are unable to prove their own consistency
- Addressed Hilbert's 6th problem: axiomatized quantum mechanics using Hilbert spaces
- Developed the game-theory-based Mutually Assured Destruction (MAD) strategic equilibrium policy, still in effect today!
- Namesake of von Neumann regular rings, the von Neumann bicommutant theorem, von Neumann entropy, and von Neumann programming languages

Language vs. architecture correspondences:

    Language        Architecture
    --------        ------------
    variables       storage
    control         test-and-set
    assignment      fetch/store
    expressions     memory refs & arithmetic

Von Neumann Architecture (diagram: CPU <-> Memory)

"Surely there must be a less primitive way of making big changes in the store than by pushing vast numbers of words back and forth through the von Neumann bottleneck. Not only is this tube a literal bottleneck for the data traffic of a problem, but, more importantly, it is an intellectual bottleneck that has kept us tied to word-at-a-time thinking instead of encouraging us to think in terms of the larger conceptual units of the task at hand. Thus programming is basically planning and detailing the enormous traffic of words through the von Neumann bottleneck, and much of that traffic concerns not significant data itself, but where to find it."
- John Backus, 1977 ACM Turing Award lecture

Backus's proposed escape from the bottleneck: functional programming.
More bottlenecks: disk, Internet.

EDVAC (1945):
- 1024 words of 44 bits each (~5.5 KB)
- 864 microseconds per addition (1,157/sec)
- 2,900 microseconds per multiplication (345/sec)
- Magnetic tape (no disk), oscilloscope
- 6,000 vacuum tubes
- 56,000 watts of power
- 17,300 lbs (7.9 metric tons), 490 sq ft
- 30 people to operate
- Cost: $500,000

Self-Replication
- Biology / DNA
- Nanotechnology
- Computer viruses
- Space exploration
- Memetics / memes
- "Gray goo"
- Self-replicating cellular automata, designed by von Neumann

Problem (extra credit): write a program that prints out its own source code (no inputs of any kind are allowed). One classic construction is sketched below.
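Such a self-reproducing program is called a quine, and one exists in every Turing-complete language (a consequence of Kleene's recursion theorem). A minimal Python sketch of one classic construction; the program stores a template of itself in a string and prints the template formatted with its own representation:

    # The quine proper is the two uncommented lines below: s is a
    # template for the whole program, and %r splices in s's own repr,
    # closing the self-reference. (These comment lines are annotation
    # only; a strict quine would have to reproduce them too.)
    s = 's = %r\nprint(s %% s)'
    print(s % s)

Running it prints exactly those two lines. Note that reading the program's own file from disk would count as input, which the problem forbids.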

"In mathematics you don't understand things. You just get used to them."
- John von Neumann

Historical Perspectives

Claude Shannon (1916-2001)
- Pioneered digital circuit design (1937): showed that Boolean algebra can describe and build switching circuits
- Founded information theory (1948)
- Introduced sampling theory; popularized the term "bit"
- Contributed to genetics and cryptography
- Joined the Institute for Advanced Study (1940); influenced by Turing, von Neumann, Einstein
- Originated information entropy, the Nyquist-Shannon sampling theorem, the Shannon-Hartley theorem, the Shannon switching game, Shannon-Fano coding, Shannon's source coding theorem, the Shannon limit, Shannon decomposition / expansion, and the Shannon number
- Other hobbies & inventions: juggling, unicycling, computer chess, rockets, a motorized pogo stick, flame-throwers, a Rubik's cube solver, a wearable computer, mathematical gambling, stock markets
- AT&T Shannon Labs is named in his honor

(Photo captions:)
- Chess champion Ed Lasker looking at Shannon's chess-playing machine
- Theseus (1950): Shannon's electro-mechanical maze-solving mouse, an early learning machine and AI experiment

(Photo captions:)
- Shannon's home study room
- Shannon's On/Off machine (the "Ultimate Machine," whose sole function is to switch itself off)

Entropy and Randomness
- Entropy measures the expected uncertainty (or "surprise") associated with a random variable.
- Entropy quantifies information content and gives a lower bound on the best possible lossless compression.
- Examples: a random fair coin has entropy of 1 bit; a biased coin has lower entropy than a fair coin; a two-headed coin has zero entropy; the string 00000000000000 has zero entropy; English text has an entropy rate of roughly 0.6 to 1.5 bits per letter.

Q: How do you simulate a fair coin with a biased coin of unknown but fixed bias?
A [von Neumann]: Flip the coin in pairs. HT and TH each occur with equal probability p(1-p), so output one bit per such pair, and discard HH and TT pairs. (A sketch of this extractor, with the entropy computation, appears below.)
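A short Python sketch of both ideas: the coin-entropy formula H(p) = -p log2 p - (1-p) log2 (1-p), and von Neumann's debiasing trick. The 0.9 bias is just an assumed value for the demo:

    import random
    from math import log2

    def coin_entropy(p):
        """Shannon entropy, in bits, of a coin with heads-probability p."""
        if p in (0.0, 1.0):
            return 0.0          # two-headed (or two-tailed) coin: no uncertainty
        return -(p * log2(p) + (1 - p) * log2(1 - p))

    def fair_bit(biased_flip):
        """von Neumann extractor: one fair bit from a biased coin of
        unknown, fixed bias. HT and TH each occur with probability
        p*(1-p), so they are equally likely; HH and TT are discarded."""
        while True:
            a, b = biased_flip(), biased_flip()
            if a != b:
                return a

    biased = lambda: random.random() < 0.9               # assumed 90%-heads coin
    print(coin_entropy(0.5))                              # fair coin: 1.0 bit
    print(coin_entropy(0.9))                              # biased coin: ~0.47 bits
    print(sum(fair_bit(biased) for _ in range(10_000)))   # close to 5000 heads

The extractor wastes flips (all HH/TT pairs are discarded), but it needs no knowledge of the bias, which is the point of the puzzle.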

Entropy and Randomness
- Information entropy is an analogue of thermodynamic entropy in physics / statistical mechanics, and of von Neumann entropy in quantum mechanics.
- Second law of thermodynamics: the entropy of an isolated system cannot decrease over time.
- Entropy as disorder or chaos; entropy as the arrow of time.
- Heat death of the universe / black holes.
- Quantum computing uses quantum information theory, which generalizes classical information theory.

Theorem: the compressibility of a string decreases as its entropy increases.
Theorem: most strings are not (losslessly) compressible.
Corollary: most strings are random! (See the counting argument below.)
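The second theorem follows from a standard counting argument, spelled out here (not on the original slide):

    \[
        \#\{\text{binary strings of length } n\} = 2^n,
        \qquad
        \#\{\text{descriptions shorter than } n - c \text{ bits}\}
        = \sum_{k=0}^{n-c-1} 2^k = 2^{n-c} - 1 .
    \]

Since each compressed description decodes to at most one string, fewer than a \(2^{-c}\) fraction of the length-\(n\) strings can be compressed by \(c\) or more bits; hence almost every string is incompressible, i.e., random.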

"My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.'"
- Claude Shannon, on his conversation with John von Neumann regarding what name to give to the measure of uncertainty or attenuation in phone-line signals (1949)

Historical Perspectives

Stephen Kleene (1909-1994)
- Founded recursive function theory
- Pioneered theoretical computer science
- Student of Alonzo Church; was at the Institute for Advanced Study (1940)
- Invented regular expressions (see the example below)
- Namesake of the Kleene star / closure, Kleene algebra, the Kleene recursion theorem, the Kleene fixed-point theorem, and the Kleene-Rosser paradox

"Kleeneliness is next to Gödeliness"
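For instance, the Kleene star '*' denotes zero or more repetitions of a pattern. A minimal Python illustration (the sample strings are just assumed test inputs):

    import re

    # (ab)* : the Kleene star matches zero or more copies of "ab",
    # including the empty string.
    for s in ["", "ab", "abab", "aba"]:
        print(repr(s), bool(re.fullmatch(r"(ab)*", s)))
    # -> '' True, 'ab' True, 'abab' True, 'aba' False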

Historical Perspectives

Noam Chomsky (1928-)
- Linguist, philosopher, cognitive scientist, political activist, dissident, author
- Father of modern linguistics
- Pioneered formal languages
- Developed generative grammars
- Invented context-free grammars (see the sketch below)
- Defined the Chomsky hierarchy
- Influenced cognitive psychology and the philosophy of language and mind
- Chomskyan linguistics, Chomskyan syntax, Chomskyan models
- Critic of U.S. foreign policy
- Most widely cited living scholar; eighth most-cited source overall!
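As a small illustration (not from the slides): the grammar S -> a S b | epsilon generates the canonical context-free language a^n b^n, which no finite automaton can recognize. A Python sketch of a recognizer:

    def in_anbn(s):
        """Recognize a^n b^n, generated by the grammar S -> a S b | epsilon.
        Each recursive call strips one 'a' from the front and one 'b'
        from the back, mirroring one application of S -> a S b."""
        if s == "":
            return True          # S -> epsilon
        return s.startswith("a") and s.endswith("b") and in_anbn(s[1:-1])

    print(in_anbn("aaabbb"))  # True
    print(in_anbn("aabbb"))   # False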

"I must admit to taking a copy of Noam Chomsky's Syntactic Structures along with me on my honeymoon in 1961... Here was a marvelous thing: a mathematical theory of language in which I could use a computer programmer's intuition!"
- Don Knuth, on Chomsky's influence