Mixed Signal IC Design Notes set 6: Mathematics of Electrical Noise


ECE145C/218C notes, M. Rodwell, copyrighted 2007
Mixed Signal IC Design Notes set 6: Mathematics of Electrical Noise
Mark Rodwell, University of California, Santa Barbara
rodwell@ece.ucsb.edu, 805-893-3244, 805-893-3262 fax

Strategy
There is not time in 145C/218C to develop this subject in detail. Strategy: give background sufficient for correct calculation of SNR, spectral densities, correlation functions, signal correlations, error rates. More detail can be found in my noise class notes (on the web), or in the literature. Van der Ziel's book is comprehensive.

Topics
Math: distributions, random variables, expectations, pairs of random variables, joint distributions, covariance and correlations. Random processes, stationarity, ergodicity, correlation functions, autocorrelation function, power spectral density. Noise models of devices: thermal and shot noise. Models of resistors, diodes, transistors, antennas. Circuit noise analysis: network representation. Solution. Total output noise. Total input noise. Generator model. En/In model. Noise figure, noise temperature. Signal / noise ratio.

Random variables

The first step: random variables
During an experiment, a random variable X takes on a particular value x. The probability that X lies between $x_1$ and $x_2$ is
$P\{x_1 < X < x_2\} = \int_{x_1}^{x_2} f_X(x)\,dx$
where $f_X(x)$ is the probability distribution function.

Example: The Gaussian Distribution
The Gaussian distribution:
$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x-\bar{x})^2}{2\sigma^2}\right)$
We will define shortly the mean ($\bar{x}$) and the standard deviation ($\sigma$). Because of the *central limit theorem*, physical random processes arising from the sum of many small effects have probability distributions close to that of the Gaussian.
[Figure: $f_X(x)$ vs. x, a bell curve of width ~$\sigma$ centered on the mean.]
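
As a quick numerical illustration of the central limit theorem statement above (this sketch is not from the notes; the sample counts and the uniform distribution of the individual effects are arbitrary choices), the sum of many small independent contributions has a spread close to Gaussian:

```python
import numpy as np

# Minimal sketch: a sum of many small independent effects approaches a Gaussian.
rng = np.random.default_rng(0)
n_effects = 200        # number of small contributions per sample (illustrative)
n_samples = 100_000    # number of realizations of the sum

# Each contribution is uniform on [-0.5, 0.5); their sum is approximately Gaussian.
x = rng.uniform(-0.5, 0.5, size=(n_samples, n_effects)).sum(axis=1)

mean, sigma = x.mean(), x.std()
print(f"sample mean  = {mean:+.3f}   (expected 0)")
print(f"sample sigma = {sigma:.3f}   (expected {np.sqrt(n_effects / 12):.3f})")

# For a Gaussian, about 68% of samples lie within one sigma of the mean.
print("fraction within 1 sigma =", np.mean(np.abs(x - mean) < sigma))
```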

Mean values and expectations
Expectation of a function g(X) of the random variable X:
$E[g(X)] = \int_{-\infty}^{+\infty} g(x) f_X(x)\,dx$
Mean value of X:
$\bar{x} = E[X] = \int_{-\infty}^{+\infty} x f_X(x)\,dx$
Expected value of $X^2$:
$E[X^2] = \int_{-\infty}^{+\infty} x^2 f_X(x)\,dx$

Variance
The variance $\sigma_X^2$ of X is its mean-square deviation from its average value $\bar{x}$:
$\sigma_X^2 = E[(X-\bar{x})^2] = \int_{-\infty}^{+\infty} (x-\bar{x})^2 f_X(x)\,dx$
The standard deviation $\sigma_X$ is simply the square root of the variance of X.

Returning to the Gaussian Distribution
The notation describing the Gaussian distribution should now be clear:
$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x-\bar{x})^2}{2\sigma^2}\right)$

Variance vs Expectation of the Square
The variance is the expectation of the square minus the square of the expectation:
$\sigma_X^2 = E[(X-\bar{x})^2] = E[X^2 - 2X\bar{x} + \bar{x}^2] = E[X^2] - 2\bar{x}E[X] + \bar{x}^2 = E[X^2] - \bar{x}^2$

Pairs of Random Variables
To understand random processes, we must first understand pairs of random variables. In an experiment, a pair of random variables X and Y takes on specific particular values x and y. Their joint behavior is described by the joint distribution $f_{XY}(x,y)$:
$P\{A < X < B \text{ and } C < Y < D\} = \int_C^D \int_A^B f_{XY}(x,y)\,dx\,dy$

Pairs of Random Variables
Marginal distributions must also be defined:
$P\{A < X < B\} = \int_A^B \int_{-\infty}^{+\infty} f_{XY}(x,y)\,dy\,dx = \int_A^B f_X(x)\,dx$
and similarly for Y:
$P\{C < Y < D\} = \int_C^D \int_{-\infty}^{+\infty} f_{XY}(x,y)\,dx\,dy = \int_C^D f_Y(y)\,dy$

Statistical Independence
In the case where $f_{XY}(x,y) = f_X(x) f_Y(y)$, the variables are said to be statistically independent. This is not generally expected.

Expectations of a pair of random variables
The expectation of a function g(X,Y) of the random variables X and Y is
$E[g(X,Y)] = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} g(x,y) f_{XY}(x,y)\,dx\,dy$
Expectation of X:
$E[X] = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} x\, f_{XY}(x,y)\,dx\,dy = \int_{-\infty}^{+\infty} x f_X(x)\,dx$
Expectation of $X^2$:
$E[X^2] = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} x^2 f_{XY}(x,y)\,dx\,dy = \int_{-\infty}^{+\infty} x^2 f_X(x)\,dx$
...and similarly for Y and $Y^2$.

Correlation between random variables
The correlation of X and Y is
$R_{XY} = E[XY] = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} x y\, f_{XY}(x,y)\,dx\,dy$
The covariance of X and Y is
$C_{XY} = E[(X-\bar{x})(Y-\bar{y})] = E[XY - X\bar{y} - Y\bar{x} + \bar{x}\bar{y}] = E[XY] - \bar{x}\bar{y}$
Note that correlation and covariance are the same if either X or Y has zero mean value.

Correlation versus Covariance
When we are working with voltages and currents, we usually separate the mean value (DC bias) from the time-varying component. The random variables then have zero mean, and correlation is then equal to covariance. It is therefore common in circuit noise analysis to use the two terms interchangeably. But nonzero mean values can return, e.g. when we calculate conditional distributions. Be careful.

Correlation Coefficient
The correlation coefficient of X and Y is
$\rho_{XY} = C_{XY} / (\sigma_X \sigma_Y)$
Note the (standard) confusion in terminology between correlation and covariance.

Sum of TWO Random Variables
Sum of two random variables: $Z = X + Y$
$E[Z^2] = E[(X+Y)^2] = E[X^2 + 2XY + Y^2] = E[X^2] + E[Y^2] + 2E[XY]$
If X and Y both have zero means,
$E[Z^2] = E[X^2] + E[Y^2] + 2C_{XY}$
This emphasizes the role of correlation.
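
A quick numerical check of the zero-mean result above (the variable names, standard deviations, and correlation coefficient below are illustrative choices, not values from the notes):

```python
import numpy as np

# Minimal sketch: E[Z^2] = E[X^2] + E[Y^2] + 2*C_XY for zero-mean, correlated X and Y.
rng = np.random.default_rng(1)
sigma_x, sigma_y, rho = 1.0, 2.0, 0.6                    # illustrative values
c_xy = rho * sigma_x * sigma_y
cov = [[sigma_x**2, c_xy], [c_xy, sigma_y**2]]

x, y = rng.multivariate_normal([0.0, 0.0], cov, size=500_000).T
z = x + y

print("E[Z^2] (sampled)     :", np.mean(z**2))
print("sx^2 + sy^2 + 2*C_XY :", sigma_x**2 + sigma_y**2 + 2 * c_xy)
```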

Pairs of Jointly Gaussian Random Variables
If X and Y are Jointly Gaussian:
$f_{XY}(x,y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho_{XY}^2}} \exp\left(-\frac{1}{2(1-\rho_{XY}^2)}\left[\frac{(x-\bar{x})^2}{\sigma_X^2} - \frac{2\rho_{XY}(x-\bar{x})(y-\bar{y})}{\sigma_X\sigma_Y} + \frac{(y-\bar{y})^2}{\sigma_Y^2}\right]\right)$
This definition can be extended to a larger # of variables. In general, we can have a Jointly Gaussian random vector $(Y_1, Y_2, \ldots, Y_n)$, which is specified by a set of means $\bar{y}_i = E[Y_i]$, variances $\sigma_{Y_i}^2 = E[(Y_i-\bar{y}_i)^2]$, and covariances $C_{Y_iY_j} = E[(Y_i-\bar{y}_i)(Y_j-\bar{y}_j)]$.

Linear Operations on JG's
If X and Y are Jointly Gaussian, and if we define $V = aX + bY$ and $W = cX + dY$, then V and W are also Jointly Gaussian. This is stated without proof; the result arises because convolution of Gaussian functions produces a Gaussian function. The result holds for JGs of any number.

Probability distribution after a Linear Operation on JG's
With $V = aX + bY$ and $W = cX + dY$:
$E[V] = E[aX + bY] = a E[X] + b E[Y]$, and similarly for $E[W]$
$\sigma_V^2 = a^2\sigma_X^2 + b^2\sigma_Y^2 + 2ab\,C_{XY}$
$\sigma_W^2 = c^2\sigma_X^2 + d^2\sigma_Y^2 + 2cd\,C_{XY}$
$C_{VW} = E[(V-\bar{v})(W-\bar{w})] = ac\,\sigma_X^2 + (ad+bc)\,C_{XY} + bd\,\sigma_Y^2$
After these tedious details, we can now calculate the joint distribution of V and W:
$f_{VW}(v,w) = \frac{1}{2\pi\sigma_V\sigma_W\sqrt{1-\rho_{VW}^2}} \exp\left(-\frac{1}{2(1-\rho_{VW}^2)}\left[\frac{(v-\bar{v})^2}{\sigma_V^2} - \frac{2\rho_{VW}(v-\bar{v})(w-\bar{w})}{\sigma_V\sigma_W} + \frac{(w-\bar{w})^2}{\sigma_W^2}\right]\right)$

Why are JG's Important?
The math on the last slide was tedious, but there is a clear conclusion: with JG's subjected to linear operations, it is sufficient to keep track of means, correlations, and variances. With this information, distribution functions can always be simply found. This vastly simplifies calculations of noise propagation in linear systems (linear circuits).
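
A small numerical check of this bookkeeping (the coefficients and statistics below are illustrative, not from the notes): propagate the covariance of $V = aX + bY$ and $W = cX + dY$ analytically and compare it with a sampled estimate.

```python
import numpy as np

# Minimal sketch: for linear operations on jointly Gaussian variables, tracking
# means, variances, and covariances is enough (illustrative values throughout).
rng = np.random.default_rng(2)
sigma_x, sigma_y, rho = 1.0, 0.5, -0.3
c_xy = rho * sigma_x * sigma_y
x, y = rng.multivariate_normal(
    [0.0, 0.0], [[sigma_x**2, c_xy], [c_xy, sigma_y**2]], size=500_000).T

a, b, c, d = 2.0, 1.0, -1.0, 3.0
v, w = a * x + b * y, c * x + d * y

print("C_VW (sampled)  :", np.mean(v * w))
print("C_VW (predicted):", a * c * sigma_x**2 + (a * d + b * c) * c_xy + b * d * sigma_y**2)
```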

Uncorrelated Variables
Uncorrelated: $C_{XY} = 0$. Statistically independent: $f_{XY}(x,y) = f_X(x) f_Y(y)$. Independence implies zero correlation. Zero correlation does not imply independence. For JG's, uncorrelated does imply independence.
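
A standard counterexample to "zero correlation implies independence" (this particular example is my own, not from the notes): let X be zero-mean Gaussian and $Y = X^2$. The covariance is zero because $E[X^3] = 0$, yet Y is completely determined by X.

```python
import numpy as np

# Minimal sketch: zero correlation without independence (Y = X^2 for Gaussian X).
rng = np.random.default_rng(3)
x = rng.standard_normal(500_000)
y = x**2                                       # fully determined by X, so not independent

c_xy = np.mean(x * y) - x.mean() * y.mean()    # sample covariance, ~0 since E[X^3] = 0
print("covariance of X and Y :", c_xy)
print("E[Y]                  :", y.mean())
print("E[Y given |X| > 2]    :", y[np.abs(x) > 2].mean())   # very different, so dependent
```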

Summing of Noise (Random) Voltages
Two voltages $V_1$ and $V_2$ are applied to the resistor R. The instantaneous time values of the random noise voltages do add: $V = V_1 + V_2$. The power dissipated in the resistor is a random variable:
$P = V^2/R = (V_1 + V_2)^2/R$
$E[P] = \left(E[V_1^2] + E[V_2^2] + 2E[V_1V_2]\right)/R = \left(\sigma_1^2 + \sigma_2^2 + 2\rho_{12}\sigma_1\sigma_2\right)/R$
The noise powers of the two random generators do not add: a correlation term must be included.

Shot Noise as a Random Variable
The fiber has transmission probability p. Send one photon, and call the # of received photons $N_1$. Then
$E[N_1] = p$, $E[N_1^2] = p$, and so $\sigma_{N_1}^2 = E[N_1^2] - E[N_1]^2 = p - p^2$
If we now send many photons (M of them), transmission of each is statistically independent, so, calling the # of received photons N,
$E[N] = M \cdot E[N_1] = Mp$ and $\sigma_N^2 = M\sigma_{N_1}^2 = M(p - p^2)$
Now suppose $M \gg 1$, $p \ll 1$, and $Mp \gg 1$: then $\sigma_N^2 \to \bar{N}$. The variance of the count approaches the mean value of the count.
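
The received count N is binomial, so the limit above is easy to check numerically (the photon count and transmission probability below are arbitrary illustrative values):

```python
import numpy as np

# Minimal sketch: binomial photon counting; variance of the count -> mean of the count.
rng = np.random.default_rng(4)
M, p = 1_000_000, 1e-3                         # many photons, low transmission probability

counts = rng.binomial(M, p, size=100_000)      # received count N over many repeated trials
print("mean of N     :", counts.mean())        # ~ M*p = 1000
print("variance of N :", counts.var())         # ~ M*p*(1 - p), essentially equal to the mean
```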

Thermal Noise as a Random Variable
A capacitor C is connected to a resistor R. The resistor is in equilibrium with a "reservoir" (a warm room) at temperature T; it can exchange energy with the room in the form of heat. C can dissipate no power: it establishes thermal equilibrium with the room via the resistor. From thermodynamics, any independent degree of freedom of a system at temperature T has mean energy kT/2, hence
$E[CV^2/2] = kT/2 \implies E[V^2] = kT/C$
The noise voltage has variance kT/C.
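
Plugging numbers into the kT/C result (the capacitor values and the 300 K temperature are illustrative choices):

```python
import numpy as np

# Minimal sketch: RMS kT/C noise voltage for a few capacitor values at T = 300 K.
k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # temperature, K

for C in (1e-15, 1e-12, 1e-9):                     # 1 fF, 1 pF, 1 nF
    v_rms = np.sqrt(k_B * T / C)                   # sqrt(kT/C), the RMS noise voltage
    print(f"C = {C:8.0e} F  ->  sqrt(kT/C) = {v_rms * 1e6:8.2f} uV rms")
```

For example, a 1 pF capacitor at room temperature carries roughly 64 µV RMS of noise.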

Random processes

Random Processes
Draw a set of graphs of functions of voltage vs. time, on separate sheets of paper. Put them into a garbage can. This garbage can is called the probability sample space. Pick out one sheet at random. This is our random function of time. The random process is V(t). The particular outcome is v(t).

Time Averages vs. Sample Space Averages
Recall the definition of the expectation of a function g(X) of a random variable X:
$E[g(X)] = \int_{-\infty}^{+\infty} g(x) f_X(x)\,dx$
This is the *average value* of g, where the average is over the sample space. With our random process definition, we can define an average over the sample space at some particular time $t_1$:
$E[g(v(t_1))] = \int_{-\infty}^{+\infty} g(v(t_1))\, f_V(v(t_1))\,d(v(t_1))$
We can also define an average of the function over time:
$A[g(v(t))] = \lim_{T\to\infty}\frac{1}{T}\int_{-T/2}^{+T/2} g(v(t))\,dt$

Ergodic Random Processes
An ergodic random process has averages over time equal to averages over the statistical sample space:
$E[g(v(t_1))] = A[g(v(t))]$
In some sense, we have made "random variation with time" equivalent to "random variation over the sample space".
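
A rough numerical illustration of ergodicity (the filtered-white-noise process and all lengths below are my own illustrative choices): the time average of $v^2(t)$ over one long record agrees with the sample-space average of $V^2(t_1)$ taken across many independent realizations at a fixed instant.

```python
import numpy as np

# Minimal sketch: time average of one realization vs. sample-space average at one instant.
rng = np.random.default_rng(5)

def realization(n):
    """One realization: white Gaussian noise passed through a short moving-average filter."""
    w = rng.standard_normal(n)
    return np.convolve(w, np.ones(10) / 10, mode="same")

v_long = realization(1_000_000)
time_avg = np.mean(v_long**2)                            # A[v(t)^2], average over time

samples_at_t1 = np.array([realization(1_000)[500] for _ in range(20_000)])
ensemble_avg = np.mean(samples_at_t1**2)                 # E[V(t1)^2], average over realizations

print("time average     :", time_avg)
print("ensemble average :", ensemble_avg)
```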

Time Samples of Random Processes
With time samples at times $t_1$ and $t_2$, the random process V(t) has values $V(t_1)$ and $V(t_2)$. $V(t_1)$ and $V(t_2)$ have some joint probability distribution. They might (or might not) be jointly Gaussian.
[Figure: one realization v(t) with the samples $v(t_1)$ and $v(t_2)$ marked.]

Random Waveforms are Random Vectors
Using Nyquist's sampling theorem, if a random signal is bandlimited, and if we pick regularly-spaced time samples $t_1 \ldots t_n$, we convert our random process into a random vector. We can thus analyze random signals using vector analysis and geometry. This is mostly beyond the scope of this class.
[Figure: one realization v(t) sampled at $t_1 \ldots t_n$.]

Stationary Random Processes
The statistics of a stationary process do not vary with time.
Nth order stationarity:
$E[f(V(t_1), V(t_2), \ldots, V(t_n))] = E[f(V(t_1+\tau), V(t_2+\tau), \ldots, V(t_n+\tau))]$ ...and lower orders.
2nd order stationarity:
$E[f(V(t_1), V(t_2))] = E[f(V(t_1+\tau), V(t_2+\tau))]$ and lower orders: $E[f(V(t_1))] = E[f(V(t_1+\tau))]$
[Figure: one realization v(t) with samples $v(t_1)$ and $v(t_2)$ marked.]

Restrictions on the random processes we consider
We will make the following restrictions to make analysis tractable:
The process will be ergodic.
The process will be stationary to any order: all statistical properties are independent of time. Many common processes are not stationary, including integrated white noise and 1/f noise.
The process will be jointly Gaussian. This means that if the values of a random process V(t) are sampled at times $t_1$, $t_2$, etc., to form random variables $V(t_1)$, $V(t_2)$, etc., then these are jointly Gaussian random variables. In nature, many random processes result from the sum of a vast number of small underlying random processes. From the central limit theorem, such processes can frequently be expected to be jointly Gaussian.

Variation of a random process with time
For the random process V(t), look at $V(t_1)$ and $V(t_2)$:
$E[V(t_1)V(t_2)] = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} v_1 v_2\, f_{V_1V_2}(v_1, v_2)\,dv_1\,dv_2$
To compute this we need to know the joint probability distribution. We have assumed a Gaussian process. The above is called the autocorrelation function. If the process is stationary, it is a function only of $\tau = t_2 - t_1$, and hence
$R_{VV}(\tau) = E[V(t)V(t+\tau)]$
This is the autocorrelation function. It describes how rapidly a random voltage varies with time. PLEASE recall we are assuming zero-mean random processes (DC bias subtracted). Thus the autocorrelation and the auto-covariance are the same.

Variation of a random process with time
Note that $R_{VV}(0) = E[V(t)V(t)] = \sigma_V^2$ gives the variance of the random process. The autocorrelation function gives us the variance of the random process and the correlation between its values for two moments in time. If the process is Gaussian, this is enough to completely describe the process.
[Figure: narrow autocorrelation $R(\tau)$: fast variation; broad autocorrelation $R(\tau)$: slow variation.]
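
Because we are assuming ergodic, zero-mean processes, $R_{VV}(\tau)$ can be estimated as a time average of $v(t)v(t+\tau)$ over one long record. The sketch below does this for a filtered-white-noise process (the process and the record length are illustrative choices, not from the notes):

```python
import numpy as np

# Minimal sketch: estimate R(tau) = E[V(t) V(t+tau)] as a time average over one record.
rng = np.random.default_rng(6)
n = 200_000
w = rng.standard_normal(n)
v = np.convolve(w, np.ones(20) / 20, mode="same")   # slowly varying -> broad autocorrelation

def autocorr(v, lag):
    """Time-average estimate of R(lag) for a zero-mean sampled record."""
    return np.mean(v[:len(v) - lag] * v[lag:])

for lag in (0, 5, 10, 20, 40):
    print(f"R({lag:2d}) = {autocorr(v, lag):+.4f}")
# R(0) is the variance; R(lag) falls toward zero once the lag exceeds the filter length.
```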

Autocorrelation is an Estimate of the Variation with Time
If random variables X and Y are Jointly Gaussian, and have zero mean, then knowledge of the value y of the outcome of Y results in a best estimate of X as follows:
$E[X \mid Y = y] = \frac{C_{XY}}{\sigma_Y^2}\,y$
"The expected value of the random variable X, given that the random variable Y has value y, is..." Hence, the autocorrelation function tells us the degree to which the signal at time t is related to the signal at time $t + \tau$. A narrow autocorrelation is indicative of a quickly-varying random process.

Power spectral densities
The autocorrelation function describes how a random process evolves with time. Find its Fourier transform:
$S_{VV}(\omega) = \int_{-\infty}^{+\infty} R_{VV}(\tau) \exp(-j\omega\tau)\,d\tau$
This is called the power spectral density of the signal. Remembering the usual Fourier transform relationships: if the power spectrum is broad, the autocorrelation function is narrow, and the signal varies rapidly--it has content at high frequencies, and the voltages at any two points in time are strongly related only if the two points are close together. If the power spectrum is narrow, the autocorrelation function is broad, and the signal varies slowly--it has content only at low frequencies, and the voltages at any two points in time are strongly related unless the two points are broadly separated.

Power spectral densities
[Figure: for a broadband process, $S(\omega)$ is broad, $R(\tau)$ is narrow, and v(t) varies rapidly; for a narrowband process, $S(\omega)$ is narrow, $R(\tau)$ is broad, and v(t) varies slowly.]

Power Spectral Densities
Recall that the Fourier transform of the autocorrelation function,
$S_{VV}(\omega) = \int_{-\infty}^{+\infty} R_{VV}(\tau) \exp(-j\omega\tau)\,d\tau$
is called the power spectral density. The inverse transform holds, so that
$R_{VV}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{+\infty} S_{VV}(\omega) \exp(+j\omega\tau)\,d\omega$
Specifically,
$R_{VV}(0) = \sigma_V^2 = \frac{1}{2\pi}\int_{-\infty}^{+\infty} S_{VV}(\omega)\,d\omega$
So, if $\sigma_V^2$ is the power in the process, then integrating the power spectral density will give us the power. This is the justification for the term "power spectral density".
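
A small numerical check of "integrating the power spectral density gives the power" (the sampled process, record length, and periodogram PSD estimate below are illustrative choices, not the notes' method):

```python
import numpy as np

# Minimal sketch: integrating a PSD estimate over frequency returns the signal power.
rng = np.random.default_rng(7)
n, dt = 1 << 16, 1e-6                       # number of samples and sample spacing (illustrative)
v = np.convolve(rng.standard_normal(n), np.ones(8) / 8, mode="same")

V = np.fft.fft(v)
S = (np.abs(V) ** 2) * dt / n               # periodogram estimate of the two-sided PSD
df = 1.0 / (n * dt)                         # frequency bin width in Hz

print("power from time record :", np.mean(v**2))
print("integral of PSD over f :", np.sum(S) * df)    # equal, by Parseval's theorem
```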

Correlated Random Processes
Two processes can be statistically related. Consider two random processes X(t) and Y(t). Define the cross-correlation function of the processes:
$R_{XY}(\tau) = E[X(t)Y(t+\tau)]$
They will have a cross-spectral density as follows:
$S_{XY}(\omega) = \int_{-\infty}^{+\infty} R_{XY}(\tau) \exp(-j\omega\tau)\,d\tau$
and therefore
$R_{XY}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{+\infty} S_{XY}(\omega) \exp(+j\omega\tau)\,d\omega$

Single-Sided Hz-based Spectral Densities
Double-sided spectral densities:
$R_{VV}(\tau) = E[V(t)V(t+\tau)]$
$S_{VV}(j\omega) = \int_{-\infty}^{+\infty} R_{VV}(\tau)\exp(-j\omega\tau)\,d\tau$,  $R_{VV}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{+\infty} S_{VV}(j\omega)\exp(+j\omega\tau)\,d\omega$
Single-sided Hz-based spectral densities:
$R_{VV}(\tau) = E[V(t)V(t+\tau)]$
$\tilde{S}_{VV}(jf) = 2\int_{-\infty}^{+\infty} R_{VV}(\tau)\exp(-j2\pi f\tau)\,d\tau$,  $R_{VV}(\tau) = \frac{1}{2}\int_{-\infty}^{+\infty} \tilde{S}_{VV}(jf)\exp(+j2\pi f\tau)\,df$

Single-Sided Hz-based Spectral Densities - Why?
Why this notation? $\tilde{S}_{VV}(jf)$ is directly the Watts of signal power per Hz of bandwidth at frequencies lying close to the frequency f. The signal power in the bandwidth $\{f_{low}, f_{high}\}$ is
$P = \int_{f_{low}}^{f_{high}} \tilde{S}_{VV}(jf)\,df$

Single-Sided Hz-based Cross Spectral Densities
Double-sided cross spectral densities:
$R_{XY}(\tau) = E[X(t)Y(t+\tau)]$
$S_{XY}(j\omega) = \int_{-\infty}^{+\infty} R_{XY}(\tau)\exp(-j\omega\tau)\,d\tau$,  $R_{XY}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{+\infty} S_{XY}(j\omega)\exp(+j\omega\tau)\,d\omega$
Single-sided Hz-based cross spectral densities:
$R_{XY}(\tau) = E[X(t)Y(t+\tau)]$
$\tilde{S}_{XY}(jf) = 2\int_{-\infty}^{+\infty} R_{XY}(\tau)\exp(-j2\pi f\tau)\,d\tau$,  $R_{XY}(\tau) = \frac{1}{2}\int_{-\infty}^{+\infty} \tilde{S}_{XY}(jf)\exp(+j2\pi f\tau)\,df$
$\tilde{S}_{XY}(jf)$ is also often written as $\overline{XY}$.

Example: Cross Spectral Densities
Let $V(t) = X(t) + Y(t)$. Then
$R_{VV}(\tau) = E[(X(t)+Y(t))(X(t+\tau)+Y(t+\tau))] = R_{XX}(\tau) + R_{YY}(\tau) + R_{XY}(\tau) + R_{YX}(\tau)$
$S_{VV}(j\omega) = S_{XX}(j\omega) + S_{YY}(j\omega) + S_{XY}(j\omega) + S_{XY}^*(j\omega) = S_{XX}(j\omega) + S_{YY}(j\omega) + 2\,\mathrm{Re}\{S_{XY}(j\omega)\}$
Or, in single-sided spectral densities:
$\tilde{S}_{VV}(jf) = \tilde{S}_{XX}(jf) + \tilde{S}_{YY}(jf) + 2\,\mathrm{Re}\{\tilde{S}_{XY}(jf)\}$

Example: Cross Spectral Densities
The power $P = V^2(t)/R$ has expected value
$E[P] = E[V^2(t)]/R = R_{VV}(0)/R$
And in the bandwidth between $f_{low}$ and $f_{high}$,
$P = \frac{1}{R}\int_{f_{low}}^{f_{high}} \tilde{S}_{VV}(jf)\,df$
Integrating with respect to frequency (over whatever bandwidth is relevant) gives the total (expected) power dissipated in R. Note that the cross-spectral density is relevant.

Noise passing through filters & linear electrical networks
If the filter has impulse response h(t) and transfer function $H(j\omega)$, then for any $v_{in}(t) \to v_{out}(t)$,
$V_{out}(j\omega) = H(j\omega) V_{in}(j\omega)$
We can show
$S_{in,out}(j\omega) = H(j\omega) S_{in,in}(j\omega)$,  $S_{out,in}(j\omega) = H^*(j\omega) S_{in,in}(j\omega)$
and
$S_{out,out}(j\omega) = |H(j\omega)|^2\, S_{in,in}(j\omega)$
[Figure: block diagram, in(t) → h(t) → out(t).]
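
The last relation is easy to check numerically: pass white noise through a simple filter and compare the averaged output periodogram with $|H|^2$ times the averaged input periodogram. The FIR filter, record lengths, and averaging below are illustrative choices, not taken from the notes:

```python
import numpy as np

# Minimal sketch: output PSD of a filtered random process is |H|^2 times the input PSD.
rng = np.random.default_rng(8)
n, n_avg = 4096, 200
h = np.array([0.5, 1.0, 0.5])                  # illustrative FIR impulse response

S_in = np.zeros(n)
S_out = np.zeros(n)
for _ in range(n_avg):                         # average many periodograms to reduce scatter
    x = rng.standard_normal(n)
    y = np.convolve(x, h)[:n]                  # filtered record, truncated to length n
    S_in += np.abs(np.fft.fft(x)) ** 2 / (n * n_avg)
    S_out += np.abs(np.fft.fft(y)) ** 2 / (n * n_avg)

H = np.fft.fft(h, n)                           # transfer function on the same frequency grid
k = n // 8                                     # compare at one representative frequency bin
print("S_out / S_in at bin k :", S_out[k] / S_in[k])
print("|H|^2 at bin k        :", np.abs(H[k]) ** 2)
```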