Nanotechnology-inspired Information Processing Systems of the Future

Lav R. Varshney, University of Illinois at Urbana-Champaign
August 31, 2016, Cross-cutting Panel 3

Putting intelligence in (trillions of) things, wearables, and phones: on-device augmented intelligence, grounded in our physical environment, operating at the speed of thought, and running perpetually.
- Mitigate information overload (attention and context-dependent priority)
- Facilitate creativity (heteroassociative memory)
- Solve problems (beyond classification)
Augment human intelligence by providing us with just the right amount of contextual information when we need it. [Image courtesy of Nic Lane, Bell Labs]

Creativity is the generation of artifacts or ideas that are novel and high-quality: spices, silks, alloys, drug cocktails, science, military strategy/tactics. [Joint Force Quarterly] [IEEE Spectrum]

A current approach to connecting nanotech with applications: Devices → Circuits/Architectures → Systems.

A Shannon-inspired approach to connecting nanotech with applications: theory mediates between the two ends of the stack (Devices → Circuits/Architectures → Systems), taking characterization from the devices and desiderata from the applications.

How can theory help us in matching application with device? (Shannon, 1948)
Specificity: sometimes applications are perfectly matched to devices, so uncoded, circuit-free, delay-free transmission is optimal.
Generality: always, applications can be mapped to an intermediate representation to become matched to devices; block codes allow any application to run on any device via a bit representation, as in the sketch below.
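
As a minimal illustration of the generality point, here is a toy Python sketch (illustrative only, not a code anyone would deploy): a message in bit representation is protected by a simple block (repetition) code and sent over a binary symmetric channel, so an "application" that produces bits can run over an unreliable "device".

    import random

    def bsc(bits, p):
        """Binary symmetric channel: flip each bit independently with probability p."""
        return [b ^ (random.random() < p) for b in bits]

    def rep_encode(bits, n=5):
        """Repetition code: repeat each bit n times."""
        return [b for b in bits for _ in range(n)]

    def rep_decode(received, n=5):
        """Majority vote over each block of n received bits."""
        return [int(sum(received[i:i + n]) > n // 2)
                for i in range(0, len(received), n)]

    random.seed(0)
    message = [random.randint(0, 1) for _ in range(1000)]
    decoded = rep_decode(bsc(rep_encode(message), p=0.1))
    errors = sum(m != d for m, d in zip(message, decoded))
    print(f"raw crossover 0.1 -> decoded bit error rate {errors / len(message):.4f}")

With n = 5 and crossover 0.1, the decoded error rate falls to the probability of 3 or more flips in 5, roughly 0.0086; stronger codes drive it toward zero at any rate below capacity.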

Theory of Specificity: the matching condition
Theorem: An information system with a source $p_U(u)$ and a channel $p_{Y|X}(y|x)$ is perfectly matched (i.e., optimal with uncoded transmission) if the basic physical resource of the technology satisfies
$b(x) = D\left( p_{Y|X}(\cdot \mid x) \,\|\, p_Y \right)$
and the basic notion of fidelity for the application satisfies
$d(u, v) = -\log p_{U|V}(u \mid v)$.
Interpretation: if Bayesian surprise (cf. creativity) is the basic physical resource of the device and logarithmic loss (cf. inference) is the basic notion of fidelity in the application, perfect matching follows. The redundancy in the source is exactly the redundancy needed to protect against the channel. Canonical example: a Gaussian source over a power-constrained AWGN channel under quadratic distortion.
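
The Gaussian example named on the slide can be verified directly; the following worked derivation uses only standard textbook facts (not taken from the slide). Scale the source to meet the power constraint, estimate by MMSE at the receiver, and compare with the distortion-rate function evaluated at capacity:

    % Source U ~ N(0, \sigma^2); channel Y = X + Z, Z ~ N(0, N); power constraint E[X^2] <= P.
    X = \sqrt{P/\sigma^2}\, U                                        % uncoded: just scale
    \hat{U} = E[U \mid Y] = \frac{\sqrt{P \sigma^2}}{P + N}\, Y       % MMSE estimate
    D_{\mathrm{uncoded}} = E[(U - \hat{U})^2] = \frac{\sigma^2 N}{P + N}
    C = \tfrac{1}{2} \log_2\!\left(1 + \tfrac{P}{N}\right), \qquad D(R) = \sigma^2\, 2^{-2R}
    D(C) = \sigma^2 \frac{N}{P + N} = D_{\mathrm{uncoded}}

Uncoded transmission therefore meets the Shannon limit exactly, with no coding and no delay.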

Theory of Generality: intermediate representation and architecture (Fano, 1961)
"Dear Prof. Varshney, Fig. 1.1 is used to specify what part of the process of communication the textbook is about, namely the transmission of messages chosen from a finite set (representable by binary numbers). In other words, the book is about the boxes labeled 'channel encoder' and 'channel decoder'. The fact that the set of possible messages is finite is basic to information theory, and sometimes forgotten. Best wishes for 2014, Bob Fano"

Predictions of synaptic microarchitecture. Method: stochastic characterization of the device, using the relationship between volume and efficacy (Matsuzaki et al., 2001; Murthy et al., 2001), then optimization of storage capacity per unit volume; see the sketch below. Experimentally verified predictions:
- Synaptic connectivity should be sparse
- Synapses should be small and noisy on average
- Heavy-tailed synaptic strength distribution
- Synapses may be discrete-valued
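
To make "optimize storage capacity per unit volume" concrete, here is a minimal Python sketch under assumptions of my own (not the model of the cited papers): a synapse stores one of four discrete strengths, read out through Gaussian noise whose standard deviation shrinks as the square root of synapse volume, and capacity is computed with the standard Blahut-Arimoto algorithm.

    import numpy as np

    def blahut_arimoto(W, tol=1e-9, max_iter=2000):
        """Capacity (bits) of a discrete memoryless channel with transition matrix W[x, y]."""
        p = np.full(W.shape[0], 1.0 / W.shape[0])      # input distribution, start uniform
        for _ in range(max_iter):
            q = p @ W                                  # induced output distribution
            # D( W(.|x) || q ) for each input symbol x
            d = np.sum(W * np.log2(W / q, where=W > 0, out=np.zeros_like(W)), axis=1)
            p_new = p * 2.0 ** d
            p_new /= p_new.sum()
            if np.max(np.abs(p_new - p)) < tol:
                p = p_new
                break
            p = p_new
        q = p @ W
        d = np.sum(W * np.log2(W / q, where=W > 0, out=np.zeros_like(W)), axis=1)
        return float(p @ d)

    def synapse_channel(levels, sigma):
        """Discretized Gaussian readout of `levels` equally spaced strengths in [0, 1]."""
        grid = np.linspace(-0.5, 1.5, 201)             # readout values
        x = np.linspace(0.0, 1.0, levels)[:, None]     # stored strengths
        pdf = np.exp(-0.5 * ((grid - x) / sigma) ** 2)
        return pdf / pdf.sum(axis=1, keepdims=True)

    for volume in (1.0, 2.0, 4.0):                     # arbitrary units
        sigma = 0.3 / np.sqrt(volume)                  # assumed: noise falls with volume
        c = blahut_arimoto(synapse_channel(4, sigma))
        print(f"volume {volume:.0f}: {c:.3f} bits, {c / volume:.3f} bits per unit volume")

In this toy, capacity grows sublinearly with volume, so bits per unit volume is maximized by many small, noisy synapses, consistent with the slide's "small and noisy on average" prediction.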

Need stochastic benchmarking of nanotech devices to connect to apps.
- Error-delay-energy tradeoff of spintronic logic: limit theorems and optimal architectural principles via information-theoretic optimization; heterogeneous energy allocation is very powerful (see the stylized sketch below). [Patil, Shanbhag, Manipatruni, Nikonov, and Young]
- Ferroelectric FET nanofunctions: rules of thumb for circuit design and specific algorithms for actual design. [Khan, Chatterjee, Duarte, Lu, Sachid, Khandelwal, Ramesh, Hu, Salahuddin]
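
The heterogeneous-energy-allocation claim can be illustrated with a stylized model of my own (not the cited spintronics work): a chain of logic stages in which stage i flips its bit with probability exp(-e_i / theta_i) when given energy e_i, with stage-dependent noise scales theta_i. Optimizing the split of a fixed energy budget then beats a uniform split:

    import numpy as np
    from scipy.optimize import minimize

    theta = np.array([1.0, 2.0, 4.0, 8.0])   # assumed stage-dependent noise scales
    E_total = 100.0                           # total energy budget (arbitrary units)

    def chain_error(e):
        """End-to-end flip probability: an odd number of stage flips."""
        eps = np.exp(-e / theta)              # per-stage flip probabilities
        return 0.5 * (1.0 - np.prod(1.0 - 2.0 * eps))

    uniform = np.full(len(theta), E_total / len(theta))
    res = minimize(chain_error, uniform, method="SLSQP",
                   bounds=[(0.0, E_total)] * len(theta),
                   constraints=[{"type": "eq", "fun": lambda e: e.sum() - E_total}])

    print(f"uniform split {uniform}: error {chain_error(uniform):.2e}")
    print(f"optimized split {np.round(res.x, 1)}: error {chain_error(res.x):.2e}")

In this toy, the optimizer shifts energy toward the noisier (large theta) stages, cutting the end-to-end error by roughly an order of magnitude relative to the uniform split; the actual spintronic analysis is of course far richer.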

A Shannon-inspired approach to connecting nanotech with applications: theory supplies the fundamental limits that link device characterization to circuit, architecture, and system desiderata. "What every engineer needs is a good set of limit theorems." [Spielberg, 1989]