The important thing is not to stop questioning! Curiosity has its own reason for existing. (Albert Einstein) Noise-based logic

The important thing is not to stop questioning! Curiosity has its own reason for existing. (Albert Einstein) Noise-based logic. Don't expect a complete or systematic talk (no time); rather something to challenge/explore. Comments and collaboration are welcome!

The important thing is not to stop questioning! Curiosity has its own reason for existing. (Albert Einstein) Noise-based logic: Why noise for deterministic logic? He Wen (1,2) and Laszlo B. Kish (1). (1) Department of Electrical and Computer Engineering, Texas A&M University, College Station; (2) Hunan University, College of Electrical and Information Engineering, Changsha, 410082, China. Although noise-based logic shows the potential advantages of reduced power dissipation and the ability to do large parallel operations with low hardware and time complexity, the question still persists: is randomness really needed beyond orthogonality? In this talk, after introducing noise-based logic, we address this question. A journal paper about this issue is coming out in the December issue of Fluctuation and Noise Letters. http://www.ece.tamu.edu/~noise/research_files/noise_based_logic.htm Presented at: ICCAD 2012, SPECIAL SESSION: Computing in the Random Noise: The Bad, the Good, and the Amazing Grace, November 5, 2012, San Jose, CA. Texas A&M University, Department of Electrical and Computer Engineering

The important thing is not to stop questioning! Curiosity has its own reason for existing. (Albert Einstein) Why is neural spike transfer stochastic? String verification in the brain Laszlo B. Kish 1, Sergey M. Bezrukov 2, Tamas Horvath 3,4, Claes-Göran Granqvist 5 1 Texas A&M University, Department of Electrical Engineering, College Station, TX 77843-3128, USA; 2 Laboratory of Physical and Structural Biology, Program in Physical Biology, NICHD, National Institutes of Health, Bethesda, MD 20892, USA; 3 Fraunhofer IAIS, Schloss Birlinghoven, D-53754 Sankt Augustin, Germany; 4 Department of Computer Science, University of Bonn, Germany; 5 Department of Engineering Sciences, The Ångström Laboratory, Uppsala University, P.O. Box 534, SE-75121 Uppsala, Sweden. The 4th International Conference on Cognitive Neurodynamics, 23-27 June 2013, Sigtuna, Sweden Texas A&M University, Department of Electrical and Computer Engineering

Present and past collaborators on noise-based logic (alphabetical order). Sergey Bezrukov (NIH): brain: logic scheme, information processing/routing, circuitry, etc. Khalyan Bollapalli (former computer engineering PhD student, TAMU): exploration of sinusoidal orthogonal logic. Zoltan Gingl (Univ. of Szeged, Hungary): modeling for circuit realization, etc. Tamas Horvath (Fraunhofer IAIS, Bonn, Germany): string verification, Hamilton coloring problem. Sunil Khatri (Computer Engineering, TAMU): hyperspace, squeezed instantaneous logic, etc. Andreas Klappenecker (Computer Science, TAMU): quantum-mimicking, large-complexity instantaneous parallel operations, etc. Ferdinand Peper (Kobe Research Center, Japan): squeezed and non-squeezed instantaneous logic, etc. Swaminathan Sethuraman (former math PhD student, TAMU): Achilles-heel operation. He Wen (Electrical Engineering, TAMU; Visiting Scholar from Hunan University, China): large-complexity instantaneous parallel operations; why noise; complex noise-based logic, etc. "Noise-based logic is one of the most ambitious attempts..."

The microprocessor problem: the Speed-Error-Power triangle

Model-picture of speed and dissipation versus miniaturization (LK, PLA, 2002). A switch is a potential barrier which exists (off position) or not (on position). To control/build the potential barrier we need energy. Maximal clock frequency: f_0 ∝ (RC)^-1. Dissipation of a single unit: P_1 ∝ f_0 E_1 ∝ (RC)^-1 C U_0^2 ∝ U_0^2 / R. Total dissipation of the chip: P_N ∝ N U_0^2 / R ∝ N U_0^2 ∝ U_0^2 / s^2, where s is the characteristic device size, the number of units scales as N ∝ 1/s^2, the CMOS drivers' channel resistance R is roughly size-independent, and the CMOS gate capacitance and supply voltage scale as C ∝ s and U_0 ∝ s.

False bit flips. Gaussian noise can reach an arbitrarily great amplitude during a long-enough period of time, and the rms noise voltage grows with miniaturization: U_n = sqrt(kT/C). [Figure: noise amplitude U_signal(t) versus time, crossing the thresholds U_H and U_L; at clock-generator events the 0 and 1 readings become erroneous relative to the power supply voltage U_0.] This is the same as the thermal activation formula; however, here we know the mean attempt frequency more accurately. For band-limited white noise with frequency band (0, f_c), the threshold-crossing frequency is nu(U_th) = (2/sqrt(3)) f_c exp[-U_th^2 / (2 U_n^2)], where U_n^2 = S_u(0) f_c.
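The exponential sensitivity of the false-flip rate to the noise margin can be checked numerically. A minimal sketch of the crossing-rate formula above; the 1 GHz bandwidth and the margin values are illustrative assumptions, not numbers from the talk:

```python
import math

def crossing_rate(u_th, u_n, f_c):
    # Threshold-crossing frequency of band-limited white Gaussian noise
    # (band (0, f_c), rms amplitude u_n):
    # nu(U_th) = (2/sqrt(3)) * f_c * exp(-U_th^2 / (2 U_n^2))
    return (2.0 / math.sqrt(3.0)) * f_c * math.exp(-u_th ** 2 / (2.0 * u_n ** 2))

f_c = 1e9  # 1 GHz noise bandwidth (illustrative)
for margin in (6.0, 8.0, 10.0, 12.0):  # threshold in units of the rms noise U_n
    print(f"U_th = {margin} U_n: {crossing_rate(margin, 1.0, f_c):.3e} flips/s")
```

Shrinking the noise margin by a few rms units raises the flip rate by many orders of magnitude, which is the mechanism behind the thermal-noise limit discussed next.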

L.B. Kish, "End of Moore's Law; Thermal (Noise) Death of Integration in Micro and Nano Electronics", Phys. Lett. A 305 (2002) 144-149. L.B. Kish, "Moore's Law and the Energy Requirement of Computing versus Performance", IEE Proc. - Circ. Dev. Syst. 151 (2004) 190-194. Energy dissipation of a single logic operation at error probability epsilon: E > kT ln(1/epsilon). For epsilon < 10^-25 this gives E ~ 60 kT. The practical situation is much worse. The prediction in 2002-2003 supposed that: the bandwidth is utilized; the supply voltage is reduced proportionally with size (to control energy dissipation and avoid early failure due to hot electrons). [Figure: required vs. actual noise margin (old and new predictions) versus size (10-100 nm); noise margin in volts, 0.1-1 V.]
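The quoted E ~ 60 kT figure follows directly from E > kT ln(1/epsilon); a one-line sketch:

```python
import math

epsilon = 1e-25
e_min_over_kT = math.log(1.0 / epsilon)  # lower bound E/kT from E > kT ln(1/eps)
print(e_min_over_kT)  # ~57.6, consistent with the E ~ 60 kT quoted in the talk
```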

November 2002 - January 2003. The conclusion was (2002): if the miniaturization continues below 30-40 nm, then the clock frequency cannot be increased. No increase since 2003! The prophecy was fulfilled much earlier. Even though Moore's law has seemingly been followed, the speed of the building elements is not utilized; the supply voltage has been kept high.

Comparison of single quantum gates with single classical logic gates. L.B. Kish, "Moore's Law and the Energy Requirement of Computing versus Performance", IEE Proc. - Circ. Dev. Syst., 2004. [Figure: power dissipation of a single gate (10^-15 to 10^15 W) versus error ratio epsilon (10^-31 to 10^-1), comparing CMOS gates and quantum gates at 3 GHz and 20 GHz clock frequency; quantum-gate data after Gea-Banacloche, Phys. Rev. Lett. 2002; the maximum total chip power of today is also indicated.]

The brain vs. computer: dreams and reality

In the "Blade Runner" movie (made in 1982), in Los Angeles in 2019, the Nexus-6 robots are more intelligent than average humans. 2019 is only 6 years from now, and nowadays we have been observing the slowdown of the evolution of computer-chip performance. We are simply nowhere compared to a Nexus-6. Have we missed the noisy neural spikes in our computer developments?

Isaac Asimov (1950s): The Three Laws of Robotics: 1. A robot may not injure a human being or, through inaction, allow a human being to come to harm. 2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law. 3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law. Not even the best supercomputers are able to address such refined perception of situations! We have great problems even with the most elementary necessities, such as recognition of the natural speech of arbitrary people, or of speech in background noise.

How does biology do it? A quick comparison. Note: the average power consumption of a supercomputer on the worldwide TOP-10 list (2012) is 1.32 million watts.

This laptop | Human brain
Power dissipation: about 12 W | Brain dissipation: about 12 W
Number of switches (transistors): 10^13 | Number of switches (neurons): 10^11
Very high bandwidth (GHz range) | Extremely low bandwidth (< 100 Hz)
Signal: deterministic, binary voltage | Signal: stochastic spike train, noise
Deterministic binary logic scheme, general-purpose (?) | Unknown logic scheme, special-purpose (???)
Potential-well-based, addressed memory | Unknown, associative memory
High speed of fast, primitive operations | Slow but intelligent operations
Low probability of errors | High probability of errors, even with simple operations
Sensitive to operational errors (freezing) | Error-robust (no freezing) (?)

Δf/f = 1/sqrt(n). Often a Poisson-like spike sequence: the relative frequency error scales as the reciprocal of the square root of the number of spikes. Supposing the maximal spike-train frequency of 100 Hz, a 1% error requires counting 10^4 spikes, which is 100 seconds of averaging! A pianist playing with a 10 Hz hit rate would have a 30% error in the rhythm at the point of brain control. Parallel channels are needed, at least 100 of them. (Note: controlling the actual muscles is also a problem of negative feedback, but we need an accurate reference signal.) Let's do the naive math: a similar number of neurons and transistors as in a palmtop, but a 30-million-times-slower clock, plus a factor of 10^4 slowdown due to the averaging needed by the stochastics. The brain should perform about 300 billion times slower than our palmtop computer!
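The averaging estimate can be reproduced directly from Δf/f = 1/sqrt(n), with the numbers used in the slide:

```python
f_max = 100.0           # Hz, maximal spike-train frequency
rel_error = 0.01        # 1% target accuracy
n_spikes = (1.0 / rel_error) ** 2  # relative error = 1/sqrt(n)  ->  n = 1/err^2
t_avg = n_spikes / f_max           # seconds of counting at f_max
print(n_spikes, t_avg)  # 10000.0 spikes, 100.0 s of averaging
```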

The brain uses different logic and computational schemes than computers do, and it uses a lot of special-purpose, noise-based operations, unlike our general-purpose computers. It must do that because its "brute force" abilities are much weaker than those of a laptop. Try to multiply the first 100 integers and check how long it takes. For a laptop computer, it takes less than a millisecond.

The brain uses different logic and computational schemes than computers do, and it uses a lot of special-purpose, noise-based operations, unlike our general-purpose computers. It must do that because its "brute force" abilities are much weaker than those of a laptop. Or try to memorize this text: vyicshrgoeeiakcledmntsstnaoatii

The brain uses different logic and computational schemes than computers do, and it uses a lot of special-purpose, noise-based operations, unlike our general-purpose computers. It must do that because its "brute force" abilities are much weaker than those of a laptop. Or try to memorize this text: vyicshrgoeeiakcledmntsstnaoatii Michel Dyakonov is a great scientist. For the brain, the second version is much easier, while for a computer it does not matter; more precisely, the first version is easier because of the lack of blank characters.

Another major difference between the brain and a computer: 010101010101010101010101010101010101010101010101010101010101010101010101010 010101010101010101010101010101010100010101010101010101010101010101010101010 To discover the difference between the two strings, the computer must compare each bit of the two strings. The brain does not have to do that: under proper conditions, it very easily discovers the difference between patterns without detailed investigation of all the details.

Noise-based logic

What noise-based logic is: noise carries the logic information. The logic base, which is a reference signal system, consists of uncorrelated (orthogonal) stochastic signals (noises). These are orthogonal vectors; superpositions are possible: a vector space. This reference system is needed to identify these vectors in a deterministic way: deterministic logic. What noise-based logic is certainly not: it is not noise-assisted signal transfer; for example, it is not stochastic resonance, it is not dithering, and it is not linearization by noise. None of these schemes use the noise as the information carrier. Note: because noise-based logic is deterministic logic, it is not stochastic computing and it is not a randomized algorithm, but it can be a natural hardware for the latter.

What does noise-based logic hardware look like? Generic noise-based logic outline. [Diagram: noise-based logic gates pass logic information (noise) from stage to stage; a Reference Noise System supplies the reference signals (noises), i.e., orthogonal stochastic time functions, to each gate.]

1. Logic signals are noises that are orthogonal to the background noise. Base: N orthogonal noises: noise-bits. 2. Multivalued logic. [Figure: noise-bit-1, noise-bit-2 and the background noise.] 3. Superpositions: N noise-bits, i.e., N bits simultaneously on a single wire. 4. Hyperspace vectors: the product of two or more different base noises is orthogonal to each base noise. Their superpositions represent 2^N bits simultaneously on a single wire. Quantum computers: N qubits represent 2^N classical bits. (Note: sinusoidal functions can also do this, see below, but there is a price.)

But periodic functions, like sinusoids, can also do this! Why noise?

But why noise? At least three major aspects of noise compared to periodic signals: - Physics: entropy production (energy dissipation). Simple wording: noise is freely available; it is generated by the system without any power requirement. Deeper: Brillouin's negentropy law. The deterministic signal has negative entropy (negentropy) due to its information entropy I_s (logarithm of the amplitude resolution; reduced relative uncertainty). Due to the Second Law of Thermodynamics, the entropy of the whole closed system cannot decrease; thus at least the same amount of positive entropy (in this case, heat) will be produced. If a resonator circuit is used in the oscillator, this heat production is repeated within the passive relaxation time (Q times the period) of the resonator, thus a continuous heating power is generated (Brillouin): P_heat = T S_s / tau >= tau^-1 kT I_s ln(2). In a resonator-free oscillator the situation is worse, because the same heat is produced at each period of the oscillation, which means the dissipation is Q times higher. - Resilience of distinguishability of time series (compare periodic vs. stochastic). - Computational complexity at certain (quantum-mimicking) special-purpose operations.

Example - 1 for entropy generation: Correlator-based noise-based logic

Basic structure of correlator-based noise-based logic with continuum noises: theoretically much less power dissipation, but that needs special devices (which may not exist yet); slower: it takes a longer time. L.B. Kish, Physics Letters A 373 (2009) 911-918. [Diagram: the input signal (noise) enters an input stage of correlators driven by the reference (base) noises; DC logic units and an output stage of analog switches produce the output signal (noise); the fast errors appear in the DC part. The logic units and output stage can together be realized by a system of analog switches.] Note: analog circuitry, but digital accuracy due to the threshold operation in the DC part!

Example: XOR gate comparing two logic vectors in a space of arbitrary dimensions (binary, multivalued, etc.), with a binary output giving the "True" value only when the two input vectors are orthogonal: Y(t) = S[<X_1(t)X_2(t)>] H(t) + S'[<X_1(t)X_2(t)>] L(t), where <...> denotes time averaging, S is a saturating (threshold) nonlinearity, and S' is its logic inverse. Even though the equation contains four multiplications, two saturation nonlinearities, one inverter, and two time averagings, the hardware realization is much simpler: it requires only one multiplier, one averaging unit, and two analog switches. Realizations of the other gates also turn out to be simpler than their mathematical equations. LK, Physics Letters A 373 (2009) 911-918. Analog circuitry but digital accuracy! [Circuit: inputs X_1(t) and X_2(t) enter an analog multiplier followed by a time average (RC); the averaged output drives an analog-switch follower selecting L(t) ("False", levels U_L, U_H) and an analog-switch inverter selecting H(t) ("True"), giving the output Y(t).] Theoretically much less power dissipation, but that needs special devices (which may not exist yet); slower: longer time. The real potential would be due to the multivalued aspects.
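The orthogonality test at the heart of this gate can be sketched in a few lines. This is a hedged software sketch, not the analog circuit: the threshold value and sample length are illustrative assumptions, and the saturation, inverter and switch stages are collapsed into a single comparison:

```python
import random
import statistics

def nbl_xor(x1, x2, threshold=0.3):
    # Output "True" (H selected) only when the time average of the product
    # is near zero, i.e., the input noise vectors are orthogonal.
    avg = statistics.fmean(a * b for a, b in zip(x1, x2))
    return abs(avg) < threshold

rng = random.Random(1)
n = 20000  # illustrative averaging length
noise_a = [rng.gauss(0.0, 1.0) for _ in range(n)]
noise_b = [rng.gauss(0.0, 1.0) for _ in range(n)]

print(nbl_xor(noise_a, noise_b))  # independent (orthogonal) noises -> True
print(nbl_xor(noise_a, noise_a))  # identical noises -> False
```

The averaging length controls the trade-off noted in the slide: digital-grade accuracy is reached, but only after a correlation time, which is why this scheme is slower.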

Examples - 2 for resilience: Instantaneous noise-based logic. a) Brain: random unipolar spike-sequence-based noise-based logic. S.M. Bezrukov, L.B. Kish, "Deterministic multivalued logic scheme for information processing and routing in the brain", Physics Letters A 373 (2009) 2338-2342. Z. Gingl, S. Khatri, L.B. Kish, "Towards brain-inspired computing", Fluctuation and Noise Letters 9 (2010) 403-412. b) With random telegraph waves: Boolean. L.B. Kish, S. Khatri, F. Peper, "Instantaneous noise-based logic", Fluctuation and Noise Letters 9 (2010) 323-330. F. Peper, L.B. Kish, "Instantaneous, non-squeezed, noise-based logic", Fluctuation and Noise Letters 10 (2011) 231-237. c) With random telegraph waves: string verification. L.B. Kish, S. Khatri, T. Horvath, "Computation using Noise-based Logic: Efficient String Verification over a Slow Communication Channel", European Physical Journal B 79 (2011) 85-90. d) With random telegraph waves: product strings in superposition (quantum-mimicking). H. Wen, L.B. Kish, A. Klappenecker, "Complex Noise-Bits and Large-Scale Instantaneous Parallel Operations with Low Complexity", Fluctuation and Noise Letters 12 (2013) 1350002.

Introducing the neuro-bit. S.M. Bezrukov, L.B. Kish, Physics Letters A 373 (2009) 2338-2342. Neural spikes, using set-theoretical operations. The sets A and B below represent two partially overlapping neural spike trains. A∩B is the overlapping part; A\B is spike train A minus the overlapping spikes; B\A is spike train B minus the overlapping spikes. A∩B, A\B and B\A are orthogonal: they have no common part! The partially overlapping spike trains can be used as neuro-bits in the same way as the noise-bits. N neuro-bits make 2^N - 1 orthogonal elements, that is, a (2^N - 1)-dimensional hyperspace: the very same multidimensional hyperspace as was obtained with the continuum noise-bits. (Kish, Khatri, Sethuraman, Physics Letters A 373 (2009) 1928-1934)
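With spike trains modeled as sets of spike times, the orthogonality of the three derived trains is immediate. A toy sketch; the spike times here are made-up illustrative values:

```python
# Hypothetical spike times (clock slots) of two partially overlapping trains
A = {1, 3, 5, 7, 9, 12}
B = {3, 4, 7, 8, 12, 14}

overlap = A & B   # A intersect B: the common spikes
only_a = A - B    # A minus the overlapping spikes
only_b = B - A    # B minus the overlapping spikes

# The three derived trains are pairwise orthogonal: no common spikes.
assert overlap & only_a == set()
assert overlap & only_b == set()
assert only_a & only_b == set()
print(sorted(overlap), sorted(only_a), sorted(only_b))
```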

S.M. Bezrukov, L.B. Kish, Physics Letters A 373 (2009) 2338-2342. [Figure, built up over three slides: the partially overlapping spike trains A and B; their overlap A∩B; and the orthogonal trains A\B and B\A.]

With 3 neuro-bits (N = 3), this makes 2^N - 1 = 7 hyperspace vectors. Using these vectors in a binary (on/off) superposition, we can represent 127 different logic values in a synthesized neural spike train on a single line. (Bezrukov, Kish, Physics Letters A 373 (2009) 2338-2342.) The vectors: (1,0,0) = A∩B'∩C'; (0,1,0) = A'∩B∩C'; (0,0,1) = A'∩B'∩C; (1,1,0) = A∩B∩C'; (1,0,1) = A∩B'∩C; (0,1,1) = A'∩B∩C; (1,1,1) = A∩B∩C, where the prime denotes the complement. The very same hyperspace as was obtained with the noise-bits. (Kish, Khatri, Sethuraman, Physics Letters A 373 (2009) 1928-1934)

A key question: can we make these set-theoretical operations with neurons? Yes: the key role of the inhibitory input of neurons then becomes clear. (Bezrukov, Kish, Physics Letters A 373 (2009) 2338-2342.) An N-th order orthogonator has N inputs for partially overlapping spike trains and 2^N - 1 output ports with orthogonal spike trains. [Circuit: the second-order orthogonator gate utilizing both excitatory (+) and inhibitory (-) synapses of neurons; the inputs A(t) and B(t) produce the orthogonal outputs A(t)B(t), A(t)B'(t) and A'(t)B(t). The input points at the left are symbolized by circles and the output points at the right by free-ending arrows; the arrows in the lines show the direction of signal propagation.]

The orthon building element and its symbol. (Bezrukov, Kish, Physics Letters A 373 (2009) 2338-2342.) [Figure: inputs A(t) and B(t), through excitatory (+) and inhibitory (-) synapses, produce A(t)B(t) (the state 1,1) and A(t)B'(t) (the state 1,0).]

Resilience: a brain signal scheme utilizing stochastic neural spikes, their superpositions, and coincidence detection. [Figure: the trains A and B with A∩B, A\B and B\A.] A coincidence detector utilizing the reference (basis vector) signals is very fast: no statistics/correlations are needed. S.M. Bezrukov, L.B. Kish, Physics Letters A 373 (2009) 2338-2342

Neural circuitry utilizing coincidences of neural spikes. The basic building element, the orthon (left), and its symbol (right). [Figure: inputs A(t) and B(t), through excitatory (+) and inhibitory (-) synapses, yield A(t)B(t) (the state 1,1) and A(t)B'(t) (the state 1,0).] Bezrukov, Kish, Physics Letters A 373 (2009) 2338-2342

Examples - 3 for computational complexity. Resilience example c) With random telegraph waves: string verification. L.B. Kish, S. Khatri, T. Horvath, "Computation using Noise-based Logic: Efficient String Verification over a Slow Communication Channel", European Physical Journal B 79 (2011) 85-90. Resilience example d) Quantum mimicking: instantaneous NBL with random telegraph waves: product strings in superposition. H. Wen, L.B. Kish, A. Klappenecker, "Complex Noise-Bits and Large-Scale Instantaneous Parallel Operations with Low Complexity", Fluctuation and Noise Letters 12 (2013) 1350002. Hyperspace-based instantaneous noise-based logic.

Instantaneous NBL. Example: random telegraph waves (RTWs); their products: hyperspace. An RTW takes the value +1 or -1 with 50% probability at the beginning of each clock period. RTW^2 = 1; RTW_1 * RTW_2 = RTW_3; all orthogonal. Advantage: it can be realized with digital circuitry and physical random-number generators, free of parasitic errors. Arbitrary N-long bit strings can be represented by 2N independent waves, 2 waves for each bit to represent the 2 possible values: V_1^0, V_1^1; V_2^0, V_2^1; V_3^0, V_3^1; V_4^0, V_4^1; V_5^0, V_5^1; V_6^0, V_6^1. The actual string is represented by the product of the N waves that correspond to the bit values; for example: 1 0 1 1 0 1 -> V_1^1 * V_2^0 * V_3^1 * V_4^1 * V_5^0 * V_6^1 = Y_1, and 1 0 0 1 0 1 -> V_1^1 * V_2^0 * V_3^0 * V_4^1 * V_5^0 * V_6^1 = Y_2. Application example: string(-difference) verification with low communication complexity. The probability that Y_1 and Y_2 go together for M steps is 0.5^M; 83 time steps result in less than a 10^-25 error probability. L.B. Kish, S. Khatri, T. Horvath, "Computation using Noise-based Logic: Efficient String Verification over a Slow Communication Channel", Eur. Phys. J. B 79 (2011) 85-90. In the brain, with unipolar spikes, XOR operations do the same.
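The string-verification idea above can be sketched directly (the string values, the seed, and the wave lengths are illustrative): two different bit strings get product waves that almost surely diverge within a few clock steps, while identical strings always agree.

```python
import random

def rtw(steps, rng):
    # Random telegraph wave: +1 or -1 with 50% probability each clock period
    return [rng.choice((-1, 1)) for _ in range(steps)]

def product_wave(bits, v0, v1):
    # Product of the per-bit reference waves selected by the bit values
    y = [1] * len(v0[0])
    for i, b in enumerate(bits):
        wave = v1[i] if b else v0[i]
        y = [a * c for a, c in zip(y, wave)]
    return y

rng = random.Random(42)
N, M = 6, 83  # 6-bit strings, 83 clock steps as in the slide
v0 = [rtw(M, rng) for _ in range(N)]
v1 = [rtw(M, rng) for _ in range(N)]

y1 = product_wave([1, 0, 1, 1, 0, 1], v0, v1)
y2 = product_wave([1, 0, 0, 1, 0, 1], v0, v1)
y3 = product_wave([1, 0, 1, 1, 0, 1], v0, v1)

# Different strings: agreement over all M steps has probability 0.5**M.
print(y1 == y2)
# Identical strings: the product waves are always identical.
print(y1 == y3)
```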

Instantaneous NBL. Example: random telegraph waves (RTWs); their products: hyperspace. An RTW takes the value +1 or -1 with 50% probability at the beginning of each clock period. RTW^2 = 1; RTW_1 * RTW_2 = RTW_3; all orthogonal. Note: simplified to save presentation time; modified timing and/or complex waves are needed for the best performance. When the binary values of a bit are represented by the waves V^0 and V^1, then the NOT operator is multiplication by V^0 * V^1. Proof: (V^0 * V^1) * V^1 = V^0 and (V^0 * V^1) * V^0 = V^1, since (V^0)^2 = (V^1)^2 = 1.
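The NOT-by-multiplication identity holds point-wise because every RTW sample squares to 1. A minimal check (the seed and wave length are illustrative):

```python
import random

rng = random.Random(7)
steps = 32
v0 = [rng.choice((-1, 1)) for _ in range(steps)]  # wave for bit value 0
v1 = [rng.choice((-1, 1)) for _ in range(steps)]  # wave for bit value 1

not_wave = [a * b for a, b in zip(v0, v1)]  # the NOT operator V^0 * V^1

assert [n * b for n, b in zip(not_wave, v1)] == v0  # (V^0*V^1)*V^1 = V^0
assert [n * a for n, a in zip(not_wave, v0)] == v1  # (V^0*V^1)*V^0 = V^1
print("NOT operator verified on", steps, "clock steps")
```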

Example-2: large, parallel operations in hyperspace. Random telegraph wave (RTW): +1 or -1 with 50% probability at the beginning of each clock period; RTW^2 = 1; RTW_1 * RTW_2 = RTW_3; all orthogonal; (V^0 * V^1) * V^1 = V^0 and (V^0 * V^1) * V^0 = V^1. Universe: the superposition of all the possible product strings on a single wire: Y_N = [V_1^0(t) + V_1^1(t)] * [V_2^0(t) + V_2^1(t)] * ... * [V_N^0(t) + V_N^1(t)]. For N = 3 the terms are V_1^0 V_2^0 V_3^0 = 0,0,0; V_1^1 V_2^0 V_3^0 = 1,0,0; V_1^0 V_2^1 V_3^0 = 0,1,0; V_1^0 V_2^0 V_3^1 = 0,0,1; V_1^1 V_2^1 V_3^0 = 1,1,0; V_1^1 V_2^0 V_3^1 = 1,0,1; V_1^0 V_2^1 V_3^1 = 0,1,1; V_1^1 V_2^1 V_3^1 = 1,1,1. The second noise-bit in the superposition of 2^N binary numbers is inverted by multiplying with (V_2^0 * V_2^1): an operation of O(N^0) hardware-complexity class!
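A numeric sketch of the O(N^0) parallel bit-flip (the chosen two-term superposition, the seed, and the wave length are illustrative): multiplying a superposition by V_2^0 * V_2^1 inverts the second bit of every product string in it at once.

```python
import random

rng = random.Random(3)
M = 256  # clock steps

def rtw():
    return [rng.choice((-1, 1)) for _ in range(M)]

def prod(waves):
    out = [1] * M
    for w in waves:
        out = [a * b for a, b in zip(out, w)]
    return out

def add(x, y):
    return [a + b for a, b in zip(x, y)]

v0 = [rtw() for _ in range(3)]  # reference waves for bit value 0
v1 = [rtw() for _ in range(3)]  # reference waves for bit value 1

# Superposition of the strings 1,0,1 and 0,0,0 on a single wire:
Y = add(prod([v1[0], v0[1], v1[2]]), prod([v0[0], v0[1], v0[2]]))

# One multiplication by V_2^0 * V_2^1 flips bit 2 of every term:
flip_bit2 = [a * b for a, b in zip(v0[1], v1[1])]
Y_flipped = [a * b for a, b in zip(Y, flip_bit2)]

# Expected result: the superposition of 1,1,1 and 0,1,0
expected = add(prod([v1[0], v1[1], v1[2]]), prod([v0[0], v1[1], v0[2]]))
assert Y_flipped == expected
print("second bit inverted across the whole superposition")
```

The check works exactly (not just statistically) because the waves are point-wise ±1 and multiplication distributes over the superposition.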

Example-2, continued: large, parallel operations in hyperspace. But this can be done with sinusoidal signals, too! Isn't that better? Then a Fourier-series analysis over the base period would serve up the full result! [The same single-wire universe of product strings as on the previous slide, from V_1^0 V_2^0 V_3^0 = 0,0,0 to V_1^1 V_2^1 V_3^1 = 1,1,1.]

The signal system with sinusoids: linear vs. exponential harmonic (sinusoidal) bases. Linear: L_r(t) = e^{j 2 pi (2r-1) f_0 t}, H_r(t) = e^{j 2 pi 2r f_0 t}. Exponential: L_r(t) = e^{j 2 pi 2^{2r-2} f_0 t}, H_r(t) = e^{j 2 pi 2^{2r-1} f_0 t}.

Bit | Logic value | Linear-representation frequency | Exponential-representation frequency
1st | L_1 | f_0 | f_0
1st | H_1 | 2 f_0 | 2 f_0
2nd | L_2 | 3 f_0 | 4 f_0
2nd | H_2 | 4 f_0 | 8 f_0
... | ... | ... | ...
Nth | L_N | (2N-1) f_0 | 2^{2N-2} f_0
Nth | H_N | 2N f_0 | 2^{2N-1} f_0

Time complexity (proportional to f_max/f_min) of the hyperspace (product) vector, the product of X_r for r = 1..N: linear O(N^2), but degenerate (e.g., L_1 H_2 = H_1 L_2); exponential O(2^{2N}), and OK (non-degenerate).
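The f_max/f_min gap between the two sinusoidal bases can be tabulated directly; a sketch under the frequency assignments above:

```python
def linear_fmax_over_f0(n_bits):
    # Highest base frequency in the linear representation: H_N at 2N * f0
    return 2 * n_bits

def exponential_fmax_over_f0(n_bits):
    # Highest base frequency in the exponential representation: H_N at 2^(2N-1) * f0
    return 2 ** (2 * n_bits - 1)

for n in (2, 4, 8, 16):
    print(n, linear_fmax_over_f0(n), exponential_fmax_over_f0(n))
```

The linear basis keeps the frequency span polynomial but is degenerate; the non-degenerate exponential basis pays with a frequency span (and hence observation-time complexity) that grows as 2^{2N}.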

Conclusions (why noise): Orthogonal noises form a freely available logic signal system (e.g., N resistors). In the brain logic scheme, noise provides extraordinary resilience compared to periodic spikes. In the (quantum-mimicking) setup of the instantaneous hyperspace, a sinusoidal hyperspace requires O(2^{2N}) time complexity, while the RTW-based scheme requires only O(1). The FFT analysis of the sinusoidal hyperspace vector requires O(2^{2N}) time complexity, while the corresponding analysis of the RTW-based one requires only O(N) time complexity (Stacho, 2012). And a lot of open questions remain, including: Tamas Horvath (Fraunhofer IAIS, Germany): "The connection between the expressive power of NBL and that of probabilistic Turing machines is an interesting open question for further research."

end of talk